Description
We need a way to translate fields of Mapeo records to the device language, e.g. Preset and Field record types could have multilingual names, labels, hints, etc.
The proposed Translations API #466 allows for storage of translated fields. This issue is about how to translate fields when reading records.
Option A: Translate in the front-end. When reading a record, for each field, use the translation API to look up a translation.
Option B: Translate in the back-end. The front end can pass a `lang` option when reading a record, and the backend will return the record with translated fields.
I think Option B is far preferable: Option A adds a lot of complexity to front-end code and has a performance impact because of multiple requests to the backend. Language matching is also complicated, so Option B keeps this logic in the backend, and it allows for optimization by caching which records and fields have translations, to avoid unnecessary lookups.
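To illustrate the difference, a front-end read under Option B could look roughly like this. This is a sketch only: the `project.preset.getByDocId()` naming, the `lang` option and the assumed `project` instance follow the proposal below and are not an existing API.

```ts
// Option B sketch: the front end passes the device language and the backend
// returns the record with translatable fields already swapped.
async function readPresetInDeviceLanguage(
  project: {
    preset: {
      getByDocId: (id: string, opts?: { lang?: string }) => Promise<{ name: string }>
    }
  },
  presetDocId: string
) {
  const preset = await project.preset.getByDocId(presetDocId, {
    lang: 'es-PE', // IETF BCP 47 tag, e.g. taken from the device locale
  })
  // preset.name is the Spanish (Peru) translation if one exists,
  // otherwise the original (default) value.
  return preset
}
```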
Implementation idea
Add an extra parameter `opts.lang` to the DataType read methods (`dataType.getByDocId()`, `dataType.getByVersionId()` and `dataType.getMany()`), which is an IETF BCP 47 language tag (using `opts.lang` to mirror the usage of the `lang` attribute in HTML).
In the Translation API, keep an in-memory cache of which record types and which field refs have indexed translations, as a performance optimization, e.g. `$translation.translations: Map<MapeoDoc['schemaName'], Set<FieldRef>>`. This should be fine to keep in memory because it is limited to the number of record types x the number of text fields per record, and will likely be much smaller.
Within the DataType read methods, because this is potentially an expensive operation, first check whether there are any indexed translations for the record type (`$translation.translations.has(recordType)`), then for each field in the read data, check whether any translations are indexed for that field ref (`$translation.translations.get(recordType).has(fieldRef)`). If a translation might exist, look it up (see the sketch after this list):
- Parse `opts.lang` using https://github.com/wooorm/bcp-47 (this can be memoized, because it will likely always be the same value).
- Map the primary language subtag to its ISO 639-3 equivalent (it could be ISO 639-1 or, unlikely, ISO 639-2).
- Read the translations with `$translation.get({ languageCode, recordType, recordId, fieldRef })` (returns an array).
- If more than one item is in the array, match the closest region.
- Replace the field value with the translation before returning.
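To make the read-time flow concrete, here is a rough TypeScript sketch. Only `$translation.get({ languageCode, recordType, recordId, fieldRef })`, the `$translation.translations` map and the bcp-47 parsing come from the proposal above; the `FieldRef`/`TranslationDoc` shapes, the `TranslationApi` interface and the use of the `iso-639-3` package for the subtag mapping are illustrative assumptions.

```ts
import { parse } from 'bcp-47'
import { iso6393 } from 'iso-639-3'

// Assumed shapes: the real types live in the translations API proposal (#466).
type FieldRef = string
interface TranslationDoc {
  languageCode: string // ISO 639-3
  regionCode?: string
  message: string
}
interface TranslationApi {
  // In-memory index of which schema names have translated field refs,
  // built as translation docs are indexed.
  translations: Map<string, Set<FieldRef>>
  get(opts: {
    languageCode: string
    recordType: string
    recordId: string
    fieldRef: FieldRef
  }): Promise<TranslationDoc[]>
}

// One option for the subtag mapping: build a 639-1/639-3 lookup once from
// the `iso-639-3` package (memoizable, done at startup).
const to6393 = new Map<string, string>()
for (const l of iso6393) {
  to6393.set(l.iso6393, l.iso6393)
  if (l.iso6391) to6393.set(l.iso6391, l.iso6393)
}

async function translateFields<T extends Record<string, unknown>>(
  doc: T,
  recordType: string,
  recordId: string,
  lang: string,
  $translation: TranslationApi
): Promise<T> {
  // Cheap bail-out: nothing indexed at all for this record type.
  const fieldRefs = $translation.translations.get(recordType)
  if (!fieldRefs || fieldRefs.size === 0) return doc

  // Parse the BCP 47 tag and normalise the primary subtag to ISO 639-3.
  const { language, region } = parse(lang)
  const languageCode = language ? to6393.get(language) : undefined
  if (!languageCode) return doc

  const translated: Record<string, unknown> = { ...doc }
  for (const fieldRef of Object.keys(doc)) {
    if (!fieldRefs.has(fieldRef)) continue // no translations indexed for this field
    const candidates = await $translation.get({
      languageCode,
      recordType,
      recordId,
      fieldRef,
    })
    if (candidates.length === 0) continue
    // Prefer an exact region match, otherwise fall back to the first result.
    const best = candidates.find((c) => c.regionCode === region) ?? candidates[0]
    translated[fieldRef] = best.message
  }
  return translated as T
}
```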
Questions
Should we support an array for `opts.lang`, since a user could have multiple preferred languages? E.g. a user's primary language could be Quechua, but Spanish might be a more suitable fallback than the default of English.
I think this is maybe a follow-up issue for post-MVP.
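If we did support an array, language selection could be a simple first-match loop over the user's preferred languages. A minimal sketch, where the `hasAnyTranslation` helper is hypothetical:

```ts
// Hypothetical: choose the first preferred language for which any translation
// exists for this record; the caller falls back to the untranslated
// (default/English) record if none match.
async function pickLanguage(
  preferred: string[], // e.g. ['qu', 'es']: Quechua first, then Spanish
  hasAnyTranslation: (lang: string) => Promise<boolean>
): Promise<string | undefined> {
  for (const lang of preferred) {
    if (await hasAnyTranslation(lang)) return lang
  }
  return undefined
}
```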
How do we handle writes? Should writes always be in English, and then translations written separately via the API, or should write methods also take opts.lang and internally write to the translation API?
Since we don't currently have any UI for writing any of the docs that we would translate (Presets, Fields), I think the former is ok - write in English, then manually write translations - because this can be done within the config import code.
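For reference, the config import path could then look roughly like this. This is a hypothetical sketch: `createPreset` and `putTranslation` stand in for whatever the real creation and translation-write methods end up being, since the write side of the API is defined in #466, not here.

```ts
// Hypothetical config-import flow: the preset is created with English
// (default) values first, then each translated message is written
// separately via the translations API.
async function importPreset(
  preset: { name: string /* ...other fields, in English */ },
  messages: Array<{ languageCode: string; fieldRef: string; message: string }>,
  deps: {
    createPreset: (p: { name: string }) => Promise<{ docId: string }>
    putTranslation: (t: {
      languageCode: string // ISO 639-3
      recordType: string
      recordId: string
      fieldRef: string
      message: string
    }) => Promise<void>
  }
) {
  const { docId } = await deps.createPreset(preset)
  for (const { languageCode, fieldRef, message } of messages) {
    await deps.putTranslation({
      languageCode,
      recordType: 'preset',
      recordId: docId,
      fieldRef,
      message,
    })
  }
}
```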
Tasks
[ ]