A Buyer’s Guide to the Localization Standards Landscape
Localization World Silicon Valley, 11 October 2011
Session Agenda. Experts. Session goals.
Entice – Interest in understanding
Educate – Awareness of issues and possibilities
Encourage – Insight into local applicability
Engage – Impact on global business performance
Program Charter, Process, & Timeline
DF as the liaison officer on behalf of the XLIFF TC
XLIFF – XML Localization Interchange File Format
OAXAL – Open Architecture for XML Authoring and Localization (Reference Model) (TC)
MLW-LT – MultilingualWeb – Language Technology
W3C – World Wide Web Consortium
OASIS – Organization for the Advancement of Structured Information Standards
ULI – Unicode Localization Interoperability (TC)
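To make the interchange format named above concrete, here is a minimal sketch of reading an XLIFF 1.2 document with Python's standard library; the sample file, its IDs, and its content are invented for illustration, while the namespace URI is the one defined by the OASIS spec.

```python
# Minimal sketch: extract source/target pairs from an XLIFF 1.2 document.
# The document content below is an invented example.
import xml.etree.ElementTree as ET

XLIFF_NS = "urn:oasis:names:tc:xliff:document:1.2"

sample = """<?xml version="1.0" encoding="UTF-8"?>
<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
  <file original="hello.txt" source-language="en" target-language="de"
        datatype="plaintext">
    <body>
      <trans-unit id="1">
        <source>Hello world</source>
        <target>Hallo Welt</target>
      </trans-unit>
    </body>
  </file>
</xliff>"""

root = ET.fromstring(sample)
units = {}
for tu in root.iter(f"{{{XLIFF_NS}}}trans-unit"):
    source = tu.find(f"{{{XLIFF_NS}}}source").text
    target = tu.find(f"{{{XLIFF_NS}}}target")
    units[source] = target.text if target is not None else None

print(units)  # {'Hello world': 'Hallo Welt'}
```

Because the payload is plain namespaced XML, any tool in the chain can round-trip it this way, which is what makes the format usable for storage and leverage as well as interchange.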
EITHER Breadth OR Depth
EITHER Normative Processing Requirements OR Informal Recommendations
EITHER Publish minimal core quickly OR try to address long tail of feature requests
EITHER improved functionality OR backwards compatibility
The 1.x standard is too complex
The 1.x standard has too generous extensibility
The 1.x standard lacks explicit conformance criteria
The overall goal is to ensure interoperability throughout Language Technology related content transformations during the whole content lifecycle.
Although the XLIFF 1.x standard was intended primarily as an exchange format, industry practice shows that the defined format is also suitable for storage and legacy content leverage purposes.
Make processing requirements an integral, normative, and obligatory part of the specification for each element and attribute
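To illustrate what explicit, machine-checkable conformance criteria can look like, here is a hypothetical sketch; the two rules checked (unique trans-unit ids, non-empty sources) are invented examples, not the actual normative requirements of any XLIFF version.

```python
# Hypothetical conformance check over XLIFF 1.2 trans-units; the rules
# here are illustrative, not taken from the specification.
import xml.etree.ElementTree as ET

NS = "urn:oasis:names:tc:xliff:document:1.2"

def check_units(xliff_text):
    """Return a list of human-readable conformance violations."""
    root = ET.fromstring(xliff_text)
    errors, seen_ids = [], set()
    for tu in root.iter(f"{{{NS}}}trans-unit"):
        uid = tu.get("id")
        if uid is None:
            errors.append("trans-unit missing required id attribute")
        elif uid in seen_ids:
            errors.append(f"duplicate trans-unit id: {uid}")
        else:
            seen_ids.add(uid)
        source = tu.find(f"{{{NS}}}source")
        if source is None or not (source.text or "").strip():
            errors.append(f"trans-unit {uid!r} has no source text")
    return errors

bad = ('<xliff xmlns="urn:oasis:names:tc:xliff:document:1.2" version="1.2">'
      '<file original="x" source-language="en" datatype="plaintext"><body>'
      '<trans-unit id="1"><source>ok</source></trans-unit>'
      '<trans-unit id="1"><source/></trans-unit>'
      '</body></file></xliff>')
print(check_units(bad))
```

The point is that once requirements are normative parts of each element's spec, tools can verify them mechanically instead of relying on informal recommendations.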
Strict Process for Feature inclusion in 2.x
Core – the basic part of the specification that contains all and only the substantial elements that cannot be excluded without negatively affecting the standard’s capability to allow for basic language technology related transformations. [Ongoing discussion on this concept; DavidF will work on deriving it from the main success scenario rather than from the vague notion of a basic LT transformation.]
15 Voting Members! And counting…
Heavy Hitters: Yves Savourel (ENLASO), Rodolfo Raya (Maxprograms), Bryan Schnabel
Traditional contributors: SAP, SDL, LRC, PSBT
New Entrants: GALA, Multicorpora, Tom Commerford
Rejoined TC recently: IBM, LIOX
On their way: Oracle, Kilgray, Welocalize, TAUS
Interested: Atril, Microsoft, Wordbee
XLIFF is an open standard: TRANSPARENT AND RF (royalty-free)
CSA – Coordination and Support Action
W3C – World Wide Web Consortium
WG – Working Group (in W3C)
Deep Web, Surface Web
LSP – Language Service Provider
TM – Translation Memory; MT – Machine Translation; TMS – Translation Management System
OASIS DITA, XLIFF
We want your logo here
Deep Web content is mostly XML and is managed by a CMS, ideally a CCMS.
Cocomore is involved in Drupal- and SharePoint-based CMS and CCMS solutions
Passing process, terminology, and translatability metadata from the CCMS on to downstream localization chain actors
Ensure that relevant Deep Web metadata resurfaces in the rendered HTML, so that real-time MT services can use it to improve their output
Again, translatability or terminology metadata will be passed on to MT to improve results
Improve MT training by passing domain- and processing-related metadata
This will allow for rapid creation of relevant training corpora, excluding out-of-domain content, raw MT output, etc. up front
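The resurfacing idea above can be sketched as follows. The HTML5 `translate` attribute is a real mechanism for marking non-translatable content; the page markup, the product name, and the idea of an MT pre-processor harvesting only translatable text are illustrative assumptions.

```python
# Sketch: an MT pre-processor that collects translatable text from
# rendered HTML, skipping subtrees marked translate="no".
# The sample page and its content are invented.
from html.parser import HTMLParser

class TranslatableText(HTMLParser):
    """Collect text nodes outside any translate="no" subtree."""
    def __init__(self):
        super().__init__()
        self.skip_depth = 0   # >0 while inside a protected subtree
        self.texts = []

    def handle_starttag(self, tag, attrs):
        # Enter (or nest deeper into) a protected subtree.
        if self.skip_depth or dict(attrs).get("translate") == "no":
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.texts.append(data.strip())

page = '<p>Try <span translate="no">AcmeCloud</span> for your projects.</p>'
parser = TranslatableText()
parser.feed(page)
print(parser.texts)  # ['Try', 'for your projects.']
```

When metadata authored deep in the CCMS survives into the rendered page like this, a downstream MT service can honor it without access to the original Deep Web source.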
* Does not apply to ISG LIS