Publication date: 10.06.2018, Format: Paperback, Binding: Softcover, Title: Website localization Standard Requirements, Author: Blokdyk, Gerardus, Publisher: 5STARCooks, Language: English, Keywords: BUSINESS & ECONOMICS // General, Category: Economics // General, encyclopedias, history, Pages: 126, Information: Paperback, Weight: 195 g, Seller: averdo
Publication date: 05/2008, Format: Paperback, Binding: Softcover, Title: Towards Localization of Anglicisms, Subtitle: A Data-driven Analysis of Anglicisms on the Best Western Italia Website, Author: Timofeeva, Ekaterina, Publisher: VDM Verlag, Language: English, Category: Linguistics // General and comparative linguistics, Pages: 176, Information: Paperback, Weight: 278 g, Seller: averdo
Towards Localization of Anglicisms from 67.99 € as paperback: A Data-driven Analysis of Anglicisms on the Best Western Italia Website. From the category: Books, English, International, Hardcover editions.
The study investigates the ways in which the use of anglicisms on the localized promotional websites of global companies may benefit those companies. It suggests that anglicisms may be used in a promotional text as a strategic device, and it defines the localization of anglicisms as the part of the localization process that adapts the stylistic, grammatical, and sociolinguistic aspects of anglicisms to the target language.

The data came from the Best Western Italia website, and the study focuses on the use of anglicisms in Italian. The data were collected between 18.12.2006 and 23.12.2006; the material consists of 48 web pages containing 10,785 words, 490 of which are anglicisms, so anglicisms make up 4.5% of the total word count.

The study performs a data-driven analysis of anglicisms, meaning that the methods were determined by the data: those characteristics of anglicisms that were most frequently encountered in the corpus were analysed. The study thus combines quantitative and qualitative methods. Firstly, the frequency and word-class distribution of anglicisms were identified. Secondly, semantic and etymological analyses of anglicisms were performed. Thirdly, a grammatical analysis of anglicisms was conducted, focusing on word-formation processes, gender assignment, plural formation, word order and agreement of noun phrases, verbs, adverbs, spelling, and false anglicisms.

The results suggest that the use of anglicisms on a localized promotional website may help to designate the target audience of the website, economize space by using shorter English words, avoid the ambiguity caused by homonyms, create an authentic atmosphere of an international environment, and enrich the user's vocabulary with new vocabulary items and new concepts.
However, it is necessary to acknowledge that the effectiveness of such a marketing strategy depends on certain characteristics of the target market: the use of anglicisms might be very popular within a particular speech community, yet ineffective or even harmful when considering a broader population.
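The first, quantitative step described above (counting anglicisms and computing their share of the corpus, here 490 out of 10,785 words, about 4.5%) can be sketched in a few lines. The token list and the anglicism lexicon below are invented placeholders, not data from the study:

```python
# Minimal sketch of the frequency step of a data-driven anglicism
# analysis: count how many tokens of a corpus appear in a lexicon
# of known anglicisms, and report their share of the total.

def anglicism_share(tokens, anglicism_lexicon):
    """Return (count, percentage) of tokens found in the lexicon."""
    hits = sum(1 for t in tokens if t.lower() in anglicism_lexicon)
    return hits, 100.0 * hits / len(tokens)

# Toy Italian sentence; "meeting" and "weekend" stand in for anglicisms.
tokens = ["il", "meeting", "di", "lavoro", "nel", "weekend", "a", "Milano"]
lexicon = {"meeting", "weekend", "business"}
count, pct = anglicism_share(tokens, lexicon)
print(count, pct)  # 2 25.0
```

A real analysis would of course also need tokenization and a curated lexicon; this only illustrates the counting step.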
This book aims to shed light on the steps translators take when rendering news items into Persian for the localized version of a website. Intending to perform a Critical Discourse Analysis (CDA) on the findings of the website localization comparison, the researcher chose a highly visited website from the news genre, namely Euronews, which can be expected to carry an ideological load. To achieve this objective, the original and localized versions of the website were examined on non-consecutive days; within this period, news items on the English and Persian websites that shared the same textual content and contained ideological implications in their translations were investigated. The textual differences between the original and localized versions of the website were extracted and analyzed critically based on Farahzad's (2011) three-dimensional CDA model and Newmark's (1988) classification of translation procedures.
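The "extract textual differences" step mentioned above can be sketched mechanically with Python's standard difflib. In the study the comparison is between English and Persian texts, so any real pipeline would first need alignment and translation; the two English sentences below are invented stand-ins used only to show the diff mechanics:

```python
# Sketch of extracting word-level textual differences between an
# original and a localized version of a news item, using difflib.ndiff.
import difflib

original = "The summit produced a historic agreement."
localized = "The summit produced an agreement."

diff = difflib.ndiff(original.split(), localized.split())
# Keep only the tokens that were removed ("- ") or added ("+ ").
changes = [tok for tok in diff if tok.startswith(("- ", "+ "))]
print(changes)
```

Each flagged token would then be a candidate for critical analysis (here, the dropped word "historic" shifts the framing of the event).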
"If a website is usable in one country it is not necessary usable in another country, for that it is essential to consider the user and his/her culture who needs to use the website". Website users around the world belong to different cultures. Members of a particular culture share a common lifestyle, and to some extent they have their own way of viewing, thinking, understanding and doing things. This book explains the relationship between websites and the anthropologists' cultural dimensions in terms of web usability. Five different groups of anthropological dimensions of culture and cultural markers are proposed: E-culture, Stable, Broad, Variable, and Vista. The five groups are organized as levels in a pyramid, and in this way they allow for different degrees of website localization: from 1 (little localization) to 5 (full localization). Next, this pyramid is formally represented in a conceptual data model, the "Cultural Conceptual Model (C2M)", using Object Role Modelling (ORM). Based on the findings, a software tool called the Localized Website Design Advisor, or "LWDA", was built to dynamically generate localized website specifications and guidelines.
This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches - which are based on optimization techniques - together with the Bayesian inference approach, whose essence lies in the use of a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing, and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, making this an invaluable resource for students and researchers seeking to understand and apply machine learning concepts.

The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, and statistical/Bayesian learning, as well as short courses on sparse modeling, deep learning, and probabilistic graphical models.

All major classical techniques are covered: mean/least-squares regression and filtering, Kalman filtering, stochastic approximation and online learning, Bayesian classification, decision trees, logistic regression, and boosting methods. So are the latest trends: sparsity, convex analysis and optimization, online distributed algorithms, learning in RKH spaces, Bayesian inference, graphical and hidden Markov models, particle filtering, deep learning, dictionary learning, and latent variable modeling. Case studies - protein folding prediction, optical character recognition, text authorship identification, fMRI data analysis, change point detection, hyperspectral image unmixing, target localization, channel equalization, and echo cancellation - show how the theory can be applied. MATLAB code for all the main algorithms is available on an accompanying website, enabling the reader to experiment with the code.
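To give a flavor of the first classical technique in the list, least-squares regression, here is a minimal single-feature sketch (in Python rather than the book's MATLAB): fit y ≈ a·x + b by minimizing the sum of squared errors, using the closed-form solution for slope and intercept.

```python
# Ordinary least-squares fit of a line to 1-D data, using the
# closed-form estimates: slope = Sxy / Sxx, intercept = y̅ - slope·x̅.

def least_squares_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through the data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)          # variance term
    sxy = sum((x - mean_x) * (y - mean_y)             # covariance term
              for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Noise-free data on the line y = 2x + 1 recovers the parameters exactly.
slope, intercept = least_squares_fit([0, 1, 2, 3], [1, 3, 5, 7])
print(slope, intercept)  # 2.0 1.0
```

With noisy data the same formulas return the line minimizing the squared residuals, which is the starting point for the filtering and regression chapters the blurb mentions.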