AI on steroids: Much bigger neural nets to come with new hardware, say Bengio, Hinton, and LeCun

  • “LeCun predicted the new hardware would lead to ‘much bigger neural nets with sparse activations,’ and he and Bengio both emphasized there is an interest in doing the same amount of work with less energy.
  • ‘There are one trillion synapses in a cubic centimeter of the brain,’ he noted. ‘If there is such a thing as General AI, it would probably require one trillion synapses.’”
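
The “sparse activations” LeCun mentions can be seen in miniature with a plain ReLU: it zeroes out negative pre-activations, so many units contribute nothing downstream and could, in principle, be skipped by sparsity-aware hardware. This is a toy sketch, not anything from the article; the values and the `relu` helper are made up for illustration.

```python
def relu(values):
    """Element-wise ReLU: negative inputs become exact zeros."""
    return [max(0.0, v) for v in values]

# Toy pre-activations for one layer (alternating sign by construction).
pre_activations = [(-1) ** i * (i + 1) * 0.1 for i in range(10)]
post = relu(pre_activations)

# Every zero is a unit that sparse hardware would not need to compute with.
sparsity = post.count(0.0) / len(post)
print(f"sparsity: {sparsity:.0%}")  # prints "sparsity: 50%"
```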

A New Type of AI Has Been Created Inspired by the Human Brain

I’ve been researching the key findings and data on AI for a few years. With that context, this post strikes me as one of the more important findings; yet, it was written in such a scientific, vanilla fashion that most people won’t stop to process the details.

“The new paper demonstrates how ultra-fast learning rates are surprisingly identical for networks that are large as well as small.

So ‘the disadvantage of the complicated brain’s learning scheme is actually an advantage,’ the researchers say.”

Here’s a related post…

How Google DeepMind is learning like a child: DeepMind uses videos to teach itself about the world

“Just like DeepMind taught an AI to interpret its surroundings via Symbol-Concept Association Network, the team leading this DeepMind project is following a similar path.

This method of learning is almost exactly like how humans think and learn to understand the world around them.”

What is the Difference Between AI and Machine Learning?

Understanding the definitions and differences of ML and AI can be daunting, so I’m always on the lookout for well-written content. This post does a really nice job, and I have attempted to further summarize it below. Check out their site for the full post.

“AI is the concept in which a machine makes smart decisions, whereas Machine Learning is a sub-field of AI that makes decisions by learning patterns from the input data.

Artificial Intelligence is about acquiring knowledge and applying it to ensure success instead of accuracy.

The Four types of Artificial Intelligence are:

  • Reactive AI
    • Lacks historical data
    • Completely reacts to a certain action
    • Reinforcement learning, where a prize is awarded for any successful action and a penalty for any unsuccessful one
  • Limited Memory
    • Past data keeps getting added to memory
  • Theory of Mind
    • Yet to be built, as it involves dealing with human emotions and psychology
  • Self-Aware AI
    • The future advancement of AI
    • Machines could be conscious and super-intelligent

Three of the most common usages of AI:

  1. Computer Vision, such as Face Recognition
  2. Natural Language Processing, like Amazon’s Alexa or Apple’s Siri
  3. Self-driving cars

What is Machine Learning?

  • Machine Learning is a state-of-the-art subset of Artificial Intelligence which lets machines learn from past data and make accurate predictions.
  • In Machine Learning, the concept of neural networks plays a significant role in allowing systems to learn by themselves.
  • Machine Learning is mostly about acquiring knowledge and maintaining better accuracy instead of success.
  • A sub-field of Machine Learning is Deep Learning. However, Deep Learning requires enormous computational power and works best with a massive amount of data. It uses neural networks whose architecture is similar to the human brain.
  • Machine Learning can be subdivided into three categories:
    • Supervised Learning
      • Both the input features and the corresponding target variable are present
    • Unsupervised Learning
      • Only the input features are present; the algorithms need to find patterns
    • Reinforcement Learning
      • Rewarded with a prize for every correct move and penalized for every incorrect move”
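
The supervised setting described above, where both the input feature and the target variable are present, can be sketched in a few lines. This is a made-up toy example, not from the summarized post: a one-parameter model y = w · x is fit by gradient descent on labeled pairs generated from the true rule y = 2x.

```python
# (input, target) pairs: both features and labels are present (supervised).
data = [(x, 2.0 * x) for x in range(1, 6)]

w = 0.0              # model parameter, starts uninformed
learning_rate = 0.01

for _ in range(200):
    # Gradient of mean squared error (w*x - y)^2 with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad

print(f"learned w = {w:.3f}")  # converges toward the true slope, 2.0
```

In the unsupervised setting only the inputs would be available and the algorithm would have to find structure (e.g. clusters) on its own; in reinforcement learning the feedback would arrive as rewards and penalties rather than as labeled targets.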

9 Artificial Intelligence Trends You Should Keep An Eye On In 2019

“Whether you like AI or not, if you are interested in what AI has in store for the future, then you are at the right place. In this article, we will look at some of the biggest AI trends that will dominate in 2019.

1) AI Enabled Chips Will Go Mainstream

2) AI and IoT Meet At The Edge

3) Say “Hello” To AutoML

  • One of the biggest trends that will dominate the AI industry in 2019 will be automated machine learning (AutoML).

4) Welcome to AIOps

  • DevOps will be replaced by AIOps, which will enable your IT department staff to conduct precise root cause analysis.

5) Neural Network Integration

  • Tech giants such as Microsoft and Facebook are already working on developing an Open Neural Network Exchange (ONNX)

6) Specialized AI Systems Become a Reality

7) AI Skills Will Decide Your Fate

  • “Most organizations want to embrace AI as part of their digital transformation but do not have the developers, AI experts, and linguists to develop their own or to even train the engines of pre-built solutions to deliver on the promise.”

8) AI Will Get into Wrong Hands

9) AI-Powered Digital Transformation

Dr. Tung Bui, chairman of the IT department and professor at the University of Hawaii, said, ‘Unlike most of the predictions and discussions about how autonomous vehicles and robots will eventually affect the job market — this is true but will take time due to institutional, political, and social reasons — I contend that the biggest trend in AI will be an acceleration in the digital transformation, making existing business systems smarter.'”

The Deep Learning Framework Backed By Facebook Is Getting Industry’s Attention


“When it comes to deep learning frameworks, TensorFlow is one of the most preferred toolkits. However, one framework that is fast becoming a favorite of developers and data scientists is PyTorch.

PyTorch is an open source project from Facebook which is used extensively within the company.

PyTorch focuses on simplicity and accessibility. It can be used by a diverse set of users, ranging from researchers and academics to developers. PyTorch uses a technique known as dynamic computation that makes it easy to train neural networks. TensorFlow is based on static computation, which executes the code only after the graph of operations is generated.”
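
The dynamic-versus-static distinction in that quote can be sketched in plain Python, without either framework. This is a conceptual toy, not actual PyTorch or TensorFlow code: “dynamic computation” (PyTorch-style eager mode) runs each operation as it is encountered, so ordinary control flow can inspect intermediate values, while “static computation” (classic TensorFlow 1.x style) first builds a graph of operations and only executes it afterwards.

```python
# --- dynamic / eager: each operation executes immediately ---
def eager_forward(x):
    h = x * 2          # computed right now
    if h > 5:          # plain Python control flow can look at h
        return h + 1
    return h

# --- static / graph: describe the computation first, run it later ---
def build_graph():
    """Return a deferred computation: a list of named ops to run later."""
    return [("mul", 2), ("add", 1)]

def run_graph(ops, x):
    for name, arg in ops:
        x = x * arg if name == "mul" else x + arg
    return x

print(eager_forward(4))             # prints 9
print(run_graph(build_graph(), 4))  # prints 9
```

Both paths compute the same result; the difference is when the work happens, which is why debugging and data-dependent control flow feel more natural in the eager style.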