Using MindForth for machine translation

1.6.3 For demonstration of machine translation (MT).

Although machine translation still lies in the future for current versions of the original, unported MindForth, the MT capability is inherently there, embedded in the cognitive architecture of the AI. Mind.Forth has three separate layers of mental activity involved in the generation of sentences of thought in natural human language. The top layer is the auditory memory array, which indiscriminately holds quasi-phonetic memory engrams of words in all the languages known by the AI -- all mixed in together and separated only by the word-fetch access tags. The words all have in common that they are composed of sounds, regardless of what human language they belong to. In an AI Mind that speaks both English and German, it is the job of the linguistic superstructure for English to keep track of all the English auditory engrams, and it is the job of the German syntax superstructure to keep track of all the German auditory engrams -- with some overlap of German terms like Kindergarten and Gesundheit being used in English, and English words like manager and software being used in German.
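As a rough sketch of the idea (in Python rather than Forth, with every name hypothetical rather than taken from MindForth's actual source), the auditory layer can be pictured as a single language-blind store of engrams, distinguishable only through their fetch-tags:

```python
# Hypothetical sketch of a mixed-language auditory memory array.
# All names and structures are illustrative, not MindForth internals.

auditory_memory = []  # engrams from all languages, stored in arrival order

def store_engram(phonemes, language):
    """Append a quasi-phonetic engram; return its fetch-tag (index)."""
    tag = len(auditory_memory)
    auditory_memory.append({"phonemes": phonemes, "lang": language})
    return tag

# English and German words end up interleaved in the same array:
tag_cat  = store_engram("K AE T", "en")
tag_hund = store_engram("H U N T", "de")
tag_dog  = store_engram("D AO G", "en")

# The store itself is language-blind; only the tags keep the languages apart.
assert auditory_memory[tag_hund]["lang"] == "de"
```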

The middle layer of linguistic generation is the semantic memory holding the lexical array of word-tags as vocabulary items for each known language. Although the AI Forthmind might know both English and German, only the fetch-tags for the words as stored in audition, but not the actual words, are stored in the lexical array -- which easily functions as if it were two separate arrays, one for English and one for German, because access to English words and access to German words are strictly separated by the associative tags coming up from conceptual thinking and the fetch-tags going up to the actual words stored in the auditory memory.
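Continuing the same illustrative Python sketch (all names hypothetical), the lexical layer would hold only fetch-tags pointing up into auditory memory, so that one physical store behaves like two separate vocabularies:

```python
# Hypothetical lexical array: per-language vocabularies store fetch-tags,
# never the words themselves (names are illustrative, not MindForth code).

auditory_memory = ["K AE T", "H U N T", "D AO G"]  # engrams, mixed languages

lexicon = {
    "en": {"cat": 0, "dog": 2},   # vocabulary item -> fetch-tag
    "de": {"Hund": 1},
}

def fetch_word(language, label):
    """Follow the fetch-tag up to the actual engram in auditory memory."""
    return auditory_memory[lexicon[language][label]]

# English access and German access never cross, though the store is shared:
assert fetch_word("de", "Hund") == "H U N T"
assert fetch_word("en", "dog") == "D AO G"
```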

The bottom layer of linguistic generation is the deep conceptual array of pre-verbal concepts interacting on spreading activation and not yet finding expression as components of thought in any particular language. Although English or German syntax may guide the formation of thought, the same thought that occurs under the control of one language may bubble up as a translated thought in any other available language, insofar as the AI Mind contains adequate vocabulary and grammar to express itself easily in more than one language.
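The bubbling-up of the same thought in another language can be caricatured in the same hypothetical sketch: a pre-verbal concept gains activation, and whichever language's lexicon is currently in control verbalizes it:

```python
# Hypothetical sketch: one pre-verbal concept, expressible in any language
# for which vocabulary exists (illustrative names only).

concepts = {"DOG": 0.0, "CAT": 0.0}          # pre-verbal activation levels
lexicon = {"en": {"DOG": "dog", "CAT": "cat"},
           "de": {"DOG": "Hund", "CAT": "Katze"}}

def activate(concept, amount):
    """Spread activation onto a concept, independent of any language."""
    concepts[concept] += amount

def express(language):
    """Verbalize the most active concept in the chosen language."""
    winner = max(concepts, key=concepts.get)
    return lexicon[language][winner]

activate("DOG", 1.0)
assert express("en") == "dog"    # the same thought...
assert express("de") == "Hund"   # ...surfaces translated, by lexicon switch
```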

When the AI Forthmind proceeds to add German to its repertoire of known human languages, it will be possible to activate either English or German in the AI Mind simply by speaking (entering input) to the AI in either English or German. Since the AI will know its English words and its German words only by means of associative tags, the same tags will, as it were, vote for which human language to activate in the mind of the computer or robot. Simply saying one German word like Gesundheit may not launch the German-speaking mechanism in the AI Mind, but entering a whole sentence of nothing but German words should get the cumulative idea across that it is time to think and speak in German.
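The cumulative "voting" described above might be sketched as follows (again in hypothetical Python, not MindForth's mechanism): each recognized input word votes for its language of origin, and the language with the most votes wins control:

```python
# Hypothetical sketch of language selection by cumulative word votes.

known_words = {"Gesundheit": "de", "der": "de", "Hund": "de",
               "the": "en", "dog": "en"}

def choose_language(sentence, default="en"):
    """Tally one vote per recognized word; majority language wins."""
    votes = {}
    for word in sentence.split():
        lang = known_words.get(word)
        if lang:
            votes[lang] = votes.get(lang, 0) + 1
    return max(votes, key=votes.get) if votes else default

# One stray German loanword is outvoted by surrounding English words,
# but a sentence of nothing but German words flips the AI into German:
assert choose_language("Gesundheit bless the dog") == "en"
assert choose_language("der Hund Gesundheit") == "de"
```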

It is not yet time for MindForth to claim a machine translation ability off-the-shelf, so to speak, because MindForth first needs to grow larger in memory size, in availability of syntactic modules, and in bootstrap vocabulary for both English and German and for any other human language being requested for purposes of machine translation. The AI may also need to grow larger in terms of the subtle interactions of subconscious concepts, so that nuances of thought in one language may survive translation into another language. But the writing is on the wall and on the Web -- Mind.Forth AI is your good bet for future developments in intelligent machine translation.

As time goes by, MindForth or its derivatives may be expected to grow into AI software that starts out as a blank slate where language is concerned, with the ability to learn human languages in the same way as a human baby learns language. In the meantime, there may be some hybrid AI Minds which are pre-programmed to speak English or German or whatever, but which also have syntactic concatenation routines (easy to program in Forth on the basis of nodal activation levels) that permit the AI to learn new syntactic sequences, either beyond what the AI already knows in one language, or in the acquisition of a human language previously unknown to the AI, or even a made-up language like Esperanto or Lojban. For any known human language, living or dead, there may eventually be at least one AI Mind (if not a parliament of Minds) tasked with knowing that human language superbly well so as to serve as an ultimate authority.

--


Reply to
mentifex

Computers can't do A.I. if they must translate to English or German: too slow.

GUI (non-text) allows a computer to go 10,000 times faster. There are no grammar rules, no arbitrary definitions to slow it down.

Embest, the moronic Chinese company, is building an EVB using the STR710FZ2: internal memory is 65KB SRAM and 256KB Flash, 144-pin chip, about $180.

I got their LPC2292, but it has much less SRAM, so it's slower by 50%.
Reply to
werty
