Pioneers of deep learning AI think its future is gonna be lit

  • “Yoshua Bengio, Geoffrey Hinton, and Yann LeCun
  • Titled “Deep Learning for AI,” the paper envisions a future in which deep learning models can learn with little or no help from humans, are flexible to changes in their environment, and can solve a wide range of reflexive and cognitive problems.
  • One example is the Transformer, a neural network architecture that has been at the heart of language models such as OpenAI’s GPT-3 and Google’s Meena. One of the benefits of Transformers is their capability to learn without the need for labeled data. Transformers can develop representations …”
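Learning "without the need for labeled data" here means self-supervision: the training targets come from the raw text itself. As an illustrative sketch (not from the paper), masked-token prediction builds training pairs from unlabeled text by hiding some tokens and keeping the originals as targets; the token names and rates below are arbitrary choices for the example:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Build a self-supervised training example from unlabeled text:
    randomly hide some tokens; the hidden originals become the targets."""
    rng = random.Random(seed)  # seeded for reproducibility
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(mask_token)
            targets[i] = tok  # the model learns to predict this token
        else:
            masked.append(tok)
    return masked, targets

tokens = "deep learning models can learn representations from raw text".split()
masked, targets = mask_tokens(tokens, mask_rate=0.3)
```

No human labeling is needed: every sentence in a corpus yields (masked input, hidden target) pairs for free, which is what lets Transformers train on web-scale text.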

Here’s what a trend-analyzing A.I. thinks will be the next big thing in tech

“Virtual and augmented reality. 3D printing. Natural language processing. Deep learning. The smart home. Driverless vehicles. Biometric technology. Genetically modified organisms. Brain-computer interfaces.

These, in descending order, are the top 10 most-invested-in emerging technologies in the United States, as ranked by number of deals. If you want to get a sense of which technologies will be shaping our future in the years to come, this probably isn’t a bad starting point.”

Artificial Intelligence Vs Machine Learning Vs Deep Learning: What exactly is the difference?

  • “Artificial Intelligence focuses on performing three cognitive skills just like a human – learning, reasoning, and self-correction.
  • Artificial Narrow Intelligence – systems are designed and trained to complete one specific task and are often termed Weak AI / Narrow AI.
  • Artificial General Intelligence is when AI systems/machines perform on par with a human. This includes the machine’s ability to interpret and understand human tone and emotions and act accordingly. This is also called Strong AI.
  • Artificial Super Intelligence/Super AI is when an artificially intelligent machine becomes self-aware and surpasses human intelligence and ability.
  • Machine Learning is naturally a subset of AI. It provides the statistical methods and algorithms that enable machines/computers to learn automatically from previous experience and data, allowing the program to change its behavior accordingly.
  • Deep Learning can be thought of as the evolution of Machine Learning, taking inspiration from the functioning of the human brain. Deep Learning is used to solve complex problems where the data is huge, diverse, and less structured. Deep learning models are built on top of Artificial Neural Networks, which mimic how the human brain works.”
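The artificial neural networks mentioned above are built from a simple unit. As a minimal sketch (weights and inputs here are arbitrary example values), a single artificial neuron computes a weighted sum of its inputs plus a bias, then applies an activation function; deep networks stack many of these into layers:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs passed through
    a sigmoid activation, the basic building block stacked into layers."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes the output into (0, 1)

out = neuron([1.0, 0.5], [0.4, -0.2], bias=0.1)
```

Learning, in this picture, means adjusting the weights and bias from data so the neuron's output moves toward the desired answer.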


Forget coding, you can now solve your AI problems with Excel

  • “While I’ve been using Excel’s mathematical tools for years, I didn’t come to appreciate its use for learning and applying data science and machine learning until I picked up Learn Data Mining Through Excel: A Step-by-Step Approach for Understanding Machine Learning Methods by Hong Zhou.
  • There’s a chapter that delves into the meticulous creation of deep learning models. First, you’ll create a single-layer artificial neural network with fewer than a dozen parameters. Then you’ll expand on the concept to create a deep learning model with hidden layers.
  • In the last chapter, you’ll create a rudimentary natural language processing (NLP) application, using Excel to create a sentiment analysis machine learning model. You’ll use formulas to create a “bag of words” model, preprocess and tokenize hotel reviews and classify them based on the density of positive and negative keywords.”
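The keyword-density approach described above can be sketched in a few lines of Python. This is a toy illustration in the spirit of the book's Excel exercise, not its actual formulas; the keyword sets below are hypothetical examples:

```python
# Hypothetical keyword lists for illustration (not from the book)
POSITIVE = {"clean", "friendly", "comfortable", "great"}
NEGATIVE = {"dirty", "rude", "noisy", "terrible"}

def classify_review(text):
    """Toy bag-of-words sentiment classifier: tokenize the review,
    then compare counts of positive vs. negative keywords."""
    tokens = text.lower().replace(".", " ").replace(",", " ").split()
    pos = sum(tok in POSITIVE for tok in tokens)
    neg = sum(tok in NEGATIVE for tok in tokens)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

In Excel the same idea is expressed with `COUNTIF`-style formulas over a tokenized word grid; the classification rule is identical: whichever keyword class is denser wins.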

3 tech trends that COVID-19 will accelerate in 2021


“The question is, how should companies focus their resources in 2021 to prepare for this changed reality and the new technologies on the horizon? Here are three trends that I predict will see massive attention in 2021 and beyond.

  1. AI must become practical
  2. Solutions become more autonomous with deep learning
  3. Promise of curing future pandemics will accelerate research in quantum computing”