
«Artificial Intelligence & Big Data». Culture and Language, by Pedro R. García Barreno

"While there may be debate whether AI will transform our world in good or evil ways, something we all agree on is that AI would be nothing without big data. Big data and AI are considered two giants."
Listening to and understanding spoken language is a task that humans perform effortlessly every day, says Angela D. Friederici, who also stresses that phonological, syntactic and semantic information must be processed in a coordinated way, within a window of milliseconds. This calls for a neurobiological model of the faculty of language, until now the hallmark of the human species.

Artificial Intelligence (AI) is one of the most transformative forces of our time. While there may be debate over whether AI will transform our world in good or evil ways, something we can all agree on is that AI would be nothing without big data; big data and AI are regarded as two giants. Machine learning is considered an advanced form of AI through which smart computers can send or receive data and learn new concepts by analyzing that data without human assistance. The Large Hadron Collider, for example, will generate about 15 petabytes of data per year. That is nothing compared with mapping a whole brain, which would involve about a million petabytes. Astronomy, chemistry, climate studies, genetics, law, materials science, neurobiology, network theory and particle theory are just a few of the areas already being transformed by large databases.

Now this revolution is coming to the humanities. Google's massive book program, which has digitized millions of books, has spun off an application that gives researchers access to a database of billions of words across several languages and two centuries: “big-and-long data”. Google's program, the Ngram Viewer, does more than provide a unique look at the history of words. It promises to change how historians do their work, and to change our picture of history itself. A new kind of scope, big data, is going to change the humanities, transform the social sciences, and renegotiate the relationship between the world of commerce and the “ivory tower”.

In parallel, cognitive architectures play a vital role in providing blueprints for building intelligent systems that support a broad range of capabilities similar to those of humans. A neural-network architecture for learning word vectors can be trained on more than 100 billion words in a day. A neural machine translation (NMT) system can translate between multiple languages, and it can also learn to perform implicit bridging between language pairs never seen explicitly during training, showing that transfer learning and zero-shot translation are possible for neural translation. In a novel training framework, deep reinforcement learning (RL) applied end to end in a completely ungrounded synthetic world where the agents communicate via symbols with no pre-specified meanings, two visually grounded dialog bots invented their own communication protocol without any human supervision (tabula rasa?). The RL agents not only significantly outperform supervised-learning agents, they also learn to play to each other's strengths, all while remaining interpretable to outside human observers. Bot-talk recalls the private speech of twins, the post-structuralist novel, or culturally constrained languages. AI languages can be evolved from a natural human language, or can be created ab initio.
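To make the “big-and-long data” idea concrete, here is a minimal Python sketch of the kind of counting a culturomics tool such as the Ngram Viewer performs: the relative frequency of a word, year by year, over a dated corpus. The three-sentence corpus is an illustrative stand-in for the billions of scanned words described above, not real Google Books data.

```python
# Minimal sketch: word frequency per year over a dated corpus (toy data).
from collections import Counter, defaultdict

corpus = [
    (1900, "the telegraph carried the news across the empire"),
    (1950, "the computer was the size of a room"),
    (2000, "the computer and the network reshaped daily life"),
]

counts_by_year = defaultdict(Counter)
totals_by_year = Counter()
for year, text in corpus:
    tokens = text.lower().split()
    counts_by_year[year].update(tokens)
    totals_by_year[year] += len(tokens)

def relative_frequency(word, year):
    """Occurrences of `word` in `year`, divided by all tokens from that year."""
    return counts_by_year[year][word] / totals_by_year[year]

for year in sorted(totals_by_year):
    print(year, round(relative_frequency("computer", year), 3))
```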
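The zero-shot behaviour attributed to multilingual NMT rests on a simple data convention: every training pair is tagged with an artificial token naming the desired target language, so one model learns many translation directions and can then be asked for a direction it never saw during training. The sketch below shows only that tagging convention; the token format and the example sentences are illustrative assumptions, not the actual training data.

```python
# Minimal sketch of the target-language-token convention behind zero-shot NMT.
def tag_pair(source_sentence, target_language):
    """Prepend an artificial token naming the target language."""
    return f"<2{target_language}> {source_sentence}"

# Directions actually seen in training, e.g. English<->Spanish and English<->Portuguese.
training_pairs = [
    (tag_pair("How are you?", "es"), "¿Cómo estás?"),
    (tag_pair("¿Cómo estás?", "en"), "How are you?"),
    (tag_pair("Good morning", "pt"), "Bom dia"),
    (tag_pair("Bom dia", "en"), "Good morning"),
]

# At inference time the same mechanism can request a direction never seen in
# training (Spanish -> Portuguese): this is zero-shot translation via
# implicit bridging.
zero_shot_request = tag_pair("¿Cómo estás?", "pt")
print(zero_shot_request)  # "<2pt> ¿Cómo estás?"
```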
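The emergent “bot-talk” can be illustrated with a toy referential game trained with REINFORCE: a sender must name one of a handful of objects using symbols that start out meaningless, and a receiver must guess which object was meant. This is a minimal sketch of the general idea, not the visually grounded dialog system discussed above; the sizes, learning rate and update rule are illustrative assumptions.

```python
# Minimal Lewis-style signaling game: two agents invent a shared code (toy example).
import numpy as np

rng = np.random.default_rng(0)
N_OBJECTS = 5   # distinct "meanings" the sender may need to convey
VOCAB = 5       # available symbols, with no pre-specified meanings
LR = 0.1

sender_logits = np.zeros((N_OBJECTS, VOCAB))      # object -> symbol policy
receiver_logits = np.zeros((VOCAB, N_OBJECTS))    # symbol -> object policy

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

for step in range(20000):
    target = rng.integers(N_OBJECTS)

    # Sender picks a symbol for the target object.
    p_s = softmax(sender_logits[target])
    symbol = rng.choice(VOCAB, p=p_s)

    # Receiver guesses which object the symbol refers to.
    p_r = softmax(receiver_logits[symbol])
    guess = rng.choice(N_OBJECTS, p=p_r)

    reward = 1.0 if guess == target else 0.0

    # REINFORCE update: raise the log-probability of the sampled actions
    # in proportion to the reward (no baseline, for brevity).
    grad_s = -p_s
    grad_s[symbol] += 1.0
    sender_logits[target] += LR * reward * grad_s
    grad_r = -p_r
    grad_r[guess] += 1.0
    receiver_logits[symbol] += LR * reward * grad_r

# After training, each object tends to map to its own symbol: a tiny
# communication protocol the two agents agreed on without supervision.
protocol = {obj: int(np.argmax(sender_logits[obj])) for obj in range(N_OBJECTS)}
print(protocol)
```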

Pedro R. García Barreno, MD, Ph.D., MBA.
Member of the Royal Spanish Academy
Member of the Royal Academy of Sciences of Spain
Member of the Scientific Committee of FIDE

