Deep Learning
Deep Learning is a subfield of machine learning concerned with algorithms inspired by the
structure and function of the brain, called artificial neural networks.
If you are just starting out in the field of deep
learning, or you have some experience with neural networks, you may be
confused or unsure about many of its concepts. I can understand this
confusion; so were many of my friends and colleagues who learned
and used neural networks in the 1990s and
early 2000s.
The leaders and experts in this field have many
ideas about deep learning, and their specific and nuanced perspectives shed a
lot of light on what deep learning is all about.
In this post you will learn what deep learning is and
how it relates to Machine Learning (ML) and Artificial Intelligence (AI).
Deep Learning has evolved hand-in-hand with the digital era, which has brought about an explosion of data in all forms and from every region of the world. This data, known as big data, is drawn from sources like social media, internet search engines, e-commerce platforms, and online cinemas, among others. This enormous amount of data is readily accessible and can be shared through technologies like cloud computing.
However, this data, which is
normally unstructured, is so vast that it could take humans decades to
comprehend it and extract the relevant information. Companies have realized the
incredible potential locked in this wealth of information and are
increasingly turning to AI systems for automated support.
🔥Important: Deep learning unravels huge amounts of unstructured data
that would normally take humans decades to understand and process.
Deep Learning Vs Machine Learning
Let us
start with the basics: what is Machine Learning and what is Deep Learning? If
you already know this, feel free to move to the next section.
What is Deep Learning
"Deep Learning is a particular kind of machine learning that achieves great power and flexibility by learning to represent the world as a nested hierarchy of concepts, with each concept defined in relation to simpler concepts, and more abstract representations computed in terms of less abstract ones."
Now, that definition may sound confusing. Let us break it down with simple examples.
Example 1- Shape detection
Let me
start with a simple example which explains how things happen at a conceptual
level. Let us try to understand how we recognize a square from other shapes.

The first
thing our eyes do is check whether there are 4 lines associated with a figure
or not (simple concept). If we find 4 lines, we further check whether they are
connected, closed, perpendicular, and equal in length (nested hierarchy
of concepts).
So, we took
a complex task (identifying a square) and broke it into simpler, less abstract
tasks. Deep Learning essentially does this at a large scale.
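This nested hierarchy of checks can be sketched in code. The snippet below is a minimal illustration, assuming a figure is represented as an ordered list of corner points (a hypothetical representation chosen just for this sketch):

```python
import math

def is_square(points):
    """Recognize a square the way the text describes: first a simple
    concept (are there 4 corners/lines?), then a nested hierarchy of
    checks (closed, equal sides, perpendicular corners)."""
    if len(points) != 4:  # simple concept: four lines?
        return False
    # Side vectors, wrapping around so the figure is closed.
    sides = [(points[(i + 1) % 4][0] - points[i][0],
              points[(i + 1) % 4][1] - points[i][1]) for i in range(4)]
    # Nested check 1: are all four sides equal in length?
    lengths = [math.hypot(dx, dy) for dx, dy in sides]
    if len(set(round(l, 6) for l in lengths)) != 1:
        return False
    # Nested check 2: are consecutive sides perpendicular (dot product ~ 0)?
    return all(abs(sides[i][0] * sides[(i + 1) % 4][0] +
                   sides[i][1] * sides[(i + 1) % 4][1]) < 1e-6
               for i in range(4))
```

For example, `is_square([(0, 0), (1, 0), (1, 1), (0, 1)])` passes every check, while a 2-by-1 rectangle fails the equal-sides check.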
Example 2—Cat Vs Dog
Let’s take
an example of an animal recognizer, where our system has to recognize whether a
given image is of a cat or a dog.
If we solve
this as a typical machine learning problem, we will define features such as
whether the animal has whiskers, whether it has ears and, if yes, whether they
are pointed. In short, we define the facial features and let the system
identify which features are most important in classifying a particular animal.
Now, deep
learning takes this one step further. It automatically finds the features that
matter for classification, where in machine learning we had to provide the
features manually.
Deep learning works as follows:
It first identifies which edges are most relevant for telling a cat from a dog
It then builds on this hierarchically to find which combinations of shapes and edges it can detect. For example, whether whiskers are present, or whether ears are present, etc.
After consecutive hierarchical identification of complex concepts, it then decides which of these features are responsible for finding the answer.
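To make the first step concrete, here is a minimal sketch of the kind of operation a deep network's first layer performs on an image. The kernel below is hand-coded purely for illustration; a real network learns many such filters from data and then combines their responses in deeper layers (edges, then shapes like ears or whiskers, then "cat" vs "dog"):

```python
def conv2d(image, kernel):
    """Slide a small filter over a grayscale image (lists of lists) and
    return its response map -- the operation a first network layer
    applies, except that the network *learns* its kernels."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A hand-coded vertical-edge detector: responds where brightness jumps.
edge_kernel = [[1, -1]]
image = [[0, 0, 1, 1],
         [0, 0, 1, 1]]
responses = conv2d(image, edge_kernel)
```

The response map is non-zero exactly at the column where the dark region meets the bright region, i.e. where a vertical edge sits.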
Example 1 – Machine Learning – Predicting weights based on height
Let us say you want to create a system which tells the expected weight based
on the height of a person. There are several reasons why something like this
could be of interest. You could use it to filter out possible frauds or
data-capture errors. The first thing you do is collect data. Let us say this
is what your data looks like:
Each point on the graph represents one data point. To start with, we can
draw a simple line to predict weight based on height. For example, the simple
line:
Weight (in kg) = Height (in cm) - 100
can help us make predictions. While the line does a decent job, we need
to understand its performance. In this case, we can say that we want to reduce
the difference between the predictions and the actual values. That is our way of
measuring performance.
Further, the more data points we collect (Experience), the better our
model will become. We can also improve our model by adding more variables (e.g.
Gender) and creating different prediction lines for them.
Example 2 – Storm prediction System
Let us take a slightly more complex example. Suppose you are building a
storm prediction system. You are given data on all the storms which have
occurred in the past, along with the weather conditions three months before the
occurrence of each storm.
Consider this: if we were to build a storm prediction system manually,
what would we have to do?
We would first have to scour through all the data and find patterns in it.
Our task is to find which conditions lead to a storm.
We could either model conditions manually, like "the temperature is greater than
40 degrees Celsius and humidity is in the range 80 to 100", and feed these
'features' to our system.
Or else, we could make our system learn from the data what
the appropriate values for these features are.
Now, to find these values, the system would go through all the previous data and
try to predict whether there will be a storm or not. Based on the values of the
features set by our system, we evaluate how the system performs, i.e. how many
times the system correctly predicts the occurrence of a storm. We can
further iterate this step multiple times, giving performance back to our
system as feedback.
Let’s take our formal definition and use it to define our storm prediction
system: our task ‘T’ here is to find the atmospheric conditions that
would set off a storm. Performance ‘P’ is, of all the conditions provided
to the system, how many times it correctly predicts a storm. And experience
‘E’ is the system’s repeated iterations over the historical data.
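The iterate-and-evaluate loop described above can be sketched as a brute-force search over candidate feature values. The records and threshold ranges below are hypothetical, a stand-in for what a learning system does automatically:

```python
# Hypothetical historical records: (temperature_c, humidity_pct, storm_occurred).
history = [(42, 90, True), (38, 85, False), (45, 95, True),
           (30, 60, False), (41, 82, True), (39, 88, False)]

def evaluate(temp_threshold, humidity_threshold):
    """Performance P: the fraction of past records predicted correctly
    by the rule 'storm if temp and humidity both exceed their thresholds'."""
    correct = 0
    for temp, humidity, storm in history:
        predicted = temp > temp_threshold and humidity > humidity_threshold
        correct += (predicted == storm)
    return correct / len(history)

# Experience E: iterate over candidate feature values, keep the best pair.
best = max(((evaluate(t, h), t, h)
            for t in range(30, 46)
            for h in range(60, 96, 5)),
           key=lambda x: x[0])
accuracy, temp_t, hum_t = best
```

Each pass over the data scores a candidate rule (performance), and the loop feeds those scores back to pick better feature values, exactly the T, P, E breakdown above in miniature.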
Fast Fact:
"Electronics maker Panasonic has been working with universities and research centers to develop deep learning technologies related to computer vision."
A Deep Learning Example
Using the fraud
detection system mentioned above with machine learning, we can create a deep
learning example. If the machine learning system created a model with
parameters built around the number of dollars a user sends or receives, the
deep learning method can start building on the results offered by machine
learning.
Each
layer of its neural network builds on the previous layer with added data such as the
retailer, sender, user, social media events, credit score, IP address, and a
host of other features that might take years to connect together if processed by
a human being. Deep learning algorithms are trained not just to create patterns
from all transactions, but also to know when a pattern signals the need for a
fraud investigation. The final layer relays a signal to an analyst, who may
freeze the user’s account until all pending investigations are finalized.
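The layer-on-layer idea can be sketched with a tiny feedforward network. The weights and inputs below are hand-set purely for illustration; a real fraud model would learn them from millions of labelled transactions:

```python
import math

def layer(inputs, weights, biases):
    """One neural-network layer: weighted sums squashed by a sigmoid.
    Each output becomes an input to the next, deeper layer."""
    return [1 / (1 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
            for ws, b in zip(weights, biases)]

# Hypothetical transaction features, e.g. amount, sender score, IP risk.
transaction = [0.9, 0.2, 0.7]

# Layer 1 combines raw features into low-level patterns; the final
# layer combines those patterns into a single fraud score.
hidden = layer(transaction,
               [[1.5, -0.5, 1.0], [0.5, 2.0, -1.0]],
               [-1.0, 0.0])
score = layer(hidden, [[2.0, -1.5]], [-0.5])[0]

# The last layer's signal is relayed to a human analyst for review.
flag_for_analyst = score > 0.5
```

Stacking more layers lets the network connect many more features (retailer, social media events, credit score, and so on) than the two shown here, which is exactly the "builds on its previous layer" behaviour described above.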
You can read my other blogs on Artificial Intelligence.



