
Thursday, February 17, 2022

Notes on Communication 170222

36,000 is the number of visual messages the human eye can register in one hour, while 15,000 is the number of words the average adult can read in the same time. Visual communication can therefore convey far more information per unit of time than text.


NLG, a subfield of artificial intelligence (AI), is a software process that automatically transforms data into plain-English content. The technology can actually tell a story – much like a human analyst would – by writing the sentences and paragraphs for you. NLG is one of the fastest-growing technologies being adopted in the enterprise. There are many use cases for NLG, but it is seen to be most effective when deployed to automate time-intensive data analysis and reporting activities (a toy sketch of the idea follows the source link).
Source- https://narrativescience.com/resource/blog/what-is-natural-language-generation/
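
A minimal, template-based sketch of the basic idea – turning structured data into a plain-English sentence. This is only an illustration, not the commercial technology described above; the record fields, thresholds, and sentence pattern are all made up for the example.

```python
# Toy NLG: turn a structured sales record into a plain-English sentence.
# The field names ("region", "revenue", etc.) are hypothetical.
def describe_sales(record: dict) -> str:
    change = record["revenue"] - record["prev_revenue"]
    direction = "rose" if change > 0 else "fell"
    pct = abs(change) / record["prev_revenue"] * 100
    return (f"{record['region']} revenue {direction} {pct:.1f}% "
            f"to ${record['revenue']:,} in {record['period']}.")

print(describe_sales({"region": "EMEA", "period": "Q4",
                      "revenue": 1_250_000, "prev_revenue": 1_100_000}))
# -> "EMEA revenue rose 13.6% to $1,250,000 in Q4."
```

Real NLG systems replace the hand-written template with models that choose content, structure, and wording automatically, but the input-to-narrative pipeline is the same in spirit.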

Narrow AI is created to solve one specific problem, for example a chatbot. Artificial general intelligence (AGI) is the theoretical application of generalized artificial intelligence to any domain, solving any problem that requires AI.
Source- https://levity.ai/blog/general-ai-vs-narrow-ai#:~:text=What's%20the%20difference%20between%20narrow,any%20problem%20that%20requires%20AI.

Moments of epiphany tend to come in the unlikeliest of circumstances. For Ian Goodfellow, PhD in machine learning, it came while discussing artificial intelligence with friends at a Montreal pub late one night in 2014. What came out of that fateful meeting was the “generative adversarial network” (GAN), an innovation that AI experts have described as the “coolest idea in deep learning in the last 20 years.”
Source- https://bdtechtalks.com/2018/05/28/generative-adversarial-networks-artificial-intelligence-ian-goodfellow/


Goodfellow’s friends were discussing how to use AI to create photos that looked realistic. The problem they faced was that the AI techniques and architectures of the time – deep learning algorithms and deep neural networks – were good at classifying images, but not very good at creating new ones.

Goodfellow came up with the idea of a new technique in which different neural networks challenge each other, learning to create and improve new content in a recursive process. That same night, he coded and tested his idea, and it worked. With the help of fellow scholars and alumni from his alma mater, Université de Montréal, Goodfellow later completed and compiled his work into the famous and highly cited paper titled “Generative Adversarial Nets.”
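
A minimal sketch of that adversarial setup (not Goodfellow’s original code): a generator learns to mimic samples from a target distribution while a discriminator learns to tell real samples from generated ones, and each network improves by competing with the other. The PyTorch usage, toy 1-D data, and all hyperparameters below are illustrative assumptions.

```python
import torch
import torch.nn as nn

latent_dim = 8  # size of the random noise vector fed to the generator

# Generator: noise -> a single "data point"; Discriminator: data point -> probability it is real.
generator = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
discriminator = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0   # "real" data: samples from N(3, 0.5)
    noise = torch.randn(64, latent_dim)
    fake = generator(noise)

    # Train the discriminator: push outputs toward 1 for real data, 0 for generated data.
    opt_d.zero_grad()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
              loss_fn(discriminator(fake.detach()), torch.zeros(64, 1)))
    d_loss.backward()
    opt_d.step()

    # Train the generator: try to make the discriminator output 1 on generated data.
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()
```

The same two-player loop, scaled up to convolutional networks and image data, is what lets GANs produce realistic-looking photos.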

4 Vs of Data: Volume, Velocity, Variety, and Veracity.
