If you are into learning languages, take a look at the startup Glossika. I believe it offers the most modern and efficient method for becoming fluent in a language. Its founder, Michael Campbell, was kind enough to explain to me that his research led to the following conclusions:
1. Many methods start by teaching grammar rules, but this is wrong. Native speakers come to school when they already “own” the language to some degree.
2. Many methods teach you words and then let you compose sentences, and this, too, is wrong: it will not lead you to fluency. Native speakers hear and learn complete phrases.
If you recognize the above approaches in the books you used to study a language, books that left you frustrated, then now you know why. So read on. Michael’s method does the opposite: it first lets you master a few (well, a thousand) phrases. By then some grammar is already ingrained in you, and you can absorb more. You also get comfortable with the structure of basic sentences. After that, you are free to apply variations by adding words to your vocabulary.
In his introduction to the course, Michael points out that you should not discard any means of learning a language, but rather use as many of them as you can obtain. In my case, I have some knowledge of about ten languages, and I have always used Assimil and Pimsleur (finding all others wanting), and I will keep them, but nothing beats Glossika for becoming fluent.
What I found fascinating, though, is that machine translation followed a similar path. Initially, machine translation was based on rules, but that never gave sufficient accuracy. Modern methods are based on statistics, and especially on statistics gathered from billions of documents. A word is analyzed in its context, and statistical vectors are built from that word together with the other words in a paragraph. This approach is called paragraph vectors, as described by Quoc Le and Tomas Mikolov. To quote the authors: “For example, the word vectors can be used to answer analogy questions using simple vector algebra: ‘King’ – ‘man’ + ‘woman’ = ‘Queen’. It is also possible to learn a linear matrix to translate words and phrases between languages.”
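To make the analogy arithmetic concrete, here is a toy sketch: the vectors below are made-up three-dimensional values chosen for illustration (real embeddings from a trained model have hundreds of dimensions), but the mechanics are the same: subtract, add, and then look for the nearest word by cosine similarity.

```python
import numpy as np

# Toy word vectors (illustrative values, not from a real model;
# real embeddings such as word2vec are learned from billions of documents).
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.5, 0.8, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
    "queen": np.array([0.9, 0.1, 0.9]),
    "apple": np.array([0.1, 0.2, 0.3]),
}

def nearest(target, exclude):
    """Return the word whose vector is most cosine-similar to target."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max((w for w in vectors if w not in exclude),
               key=lambda w: cos(vectors[w], target))

# "king" - "man" + "woman" should land near "queen"
result = nearest(vectors["king"] - vectors["man"] + vectors["woman"],
                 exclude={"king", "man", "woman"})
print(result)  # queen
```

With real embeddings the result vector rarely equals any word vector exactly, which is why the nearest-neighbor search (rather than exact equality) is the standard way to read off the answer.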
This says that the meaning of each word corresponds very closely to its position in a vector space created by the algorithm. It also means that once you have the basic sentence-to-sentence translations, you can translate a variation by interpolating between the translations of the known phrases.
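The interpolation claim follows from linearity, and can be sketched with synthetic data. Below, the “target language” space is constructed as an exact linear image of the “source language” space, so a matrix fitted by least squares (in the spirit of Mikolov’s linear translation matrix) recovers the mapping, and a blend of two known phrases maps exactly to the blend of their translations. Real bilingual embeddings satisfy this only approximately; everything here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic embeddings: the target space is an exact rotation of the
# source space, so a linear map can recover it perfectly.
d = 4
rotation, _ = np.linalg.qr(rng.normal(size=(d, d)))  # random orthogonal matrix
source = rng.normal(size=(20, d))   # 20 known source-phrase vectors
target = source @ rotation          # their "translations"

# Fit W minimizing ||source @ W - target||^2 (least squares).
W, *_ = np.linalg.lstsq(source, target, rcond=None)

# "Translate" an unseen phrase built as a blend of two known ones:
# by linearity it lands at the same blend of their translations.
new = 0.5 * source[0] + 0.5 * source[1]
mapped = new @ W
expected = 0.5 * target[0] + 0.5 * target[1]
print(np.allclose(mapped, expected))  # linearity preserves interpolation
```

The design point is simply that a linear map commutes with interpolation; that is what licenses translating a variation of known sentences by moving between their vector representations.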
Now one is struck by two similarities: the progression from rule-based to statistics-based translation parallels item 1 in Glossika’s approach, and the translation of sentences by interpolating among the basic ones parallels item 2. The modern way of learning languages and the way Google does automatic translation are similar!
Here is my explanation: it is natural. The best linguists like this one have figured out their ways of learning languages, and they are similar to Glossika’s. And the machine translation methods are based on neural networks, which imitate human brains. So experts from both fields have converged on similar approaches because these approaches lead to the best results.
What is next? According to Mikolov, in a podcast with Radim Řehůřek, previous results were all child’s play: the fun starts when you create machines that learn by themselves. But according to Ray Kurzweil, the future lies in merging the computer and the human brain. That merger has already begun: Elon Musk created a company for embedding computers (in the form of microelectrodes) into the human brain. It was predicted by Vladimir Sorokin, a Russian writer who described a world in which computer “fleas” are embedded into humans (for a price) in his latest novel “Manaraga”, which came out in March of this year. You will have to learn Russian; there is no translation yet.