Ian Goodfellow: Inventor of GANs
- Pranshu Aggarwal
- Jan 20, 2021
- 3 min read
Ian Goodfellow (born 1985) is a machine learning researcher who currently works at Apple. He studied under Andrew Ng at Stanford University and later worked at Google Brain, where he made many contributions to the field. His best-known contribution is the Generative Adversarial Network (GAN), introduced in 2014.

A GAN is a game between two players, both of which are neural networks. One, the generator, produces fake samples; the other, the discriminator, tries to tell those fakes apart from real data. Each network tries to defeat the other, and in the process both are trained without human supervision. Arithmetic operations can even be performed on GAN inputs to manipulate the generated output. At a seminar in 2016, Yann LeCun described GANs as the coolest idea in machine learning. The biggest benefit of GANs is that no human is needed to supervise the training.

GANs have many applications:
1. They can create photos of imaginary models, removing the need to hire models, photographers, and so on.
2. They can enhance astronomical images; in 2019 they were even used successfully to model dark matter.
3. They can upscale textures in video games such as Final Fantasy VIII and Final Fantasy IX.
They can also show how a person would look at different ages, visualize the effect of climate change on specific houses, reconstruct an image of a person from their voice, and more. Overall, GANs are a remarkably elegant way to train neural networks.

In 2017 Goodfellow was named in MIT Technology Review's 35 Innovators Under 35, and in 2019 he was included in Foreign Policy's list of 100 Global Thinkers. He is regarded as one of the youngest and most respected AI researchers in the world. Among his early projects was a system that enabled Google Maps to automatically transcribe house numbers from photos taken by Street View cars. That was not an easy machine learning task.
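The two-player game can be made concrete with a small sketch. The following is a minimal NumPy illustration, not Goodfellow's implementation: the "networks" are a hypothetical one-parameter generator and logistic discriminator, and the function computes the GAN value V(D, G) = E[log D(x)] + E[log(1 − D(G(z)))] that the discriminator maximizes and the generator minimizes.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy "networks": a linear generator and a logistic discriminator.
def generator(z, theta_g):
    # Maps input noise z to a fake sample.
    return theta_g[0] * z + theta_g[1]

def discriminator(x, theta_d):
    # Outputs the probability that x came from the real data.
    return sigmoid(theta_d[0] * x + theta_d[1])

def gan_value(real, noise, theta_g, theta_d, eps=1e-8):
    # V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))]
    fake = generator(noise, theta_g)
    return (np.mean(np.log(discriminator(real, theta_d) + eps))
            + np.mean(np.log(1.0 - discriminator(fake, theta_d) + eps)))

real = rng.normal(4.0, 1.0, size=1000)   # "real" data drawn from N(4, 1)
noise = rng.normal(0.0, 1.0, size=1000)  # noise fed to the generator
v = gan_value(real, noise, theta_g=(1.0, 0.0), theta_d=(1.0, -2.0))
print(v)  # a finite scalar: D is trained to raise it, G to lower it
```

In an actual GAN both players are deep networks and their parameters are updated by alternating gradient steps on this same objective.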
Many house numbers are written in stylized ways that are difficult for a model to read. The system takes an input image X and, after passing it through hidden layers, produces six outputs: the number of digits (called L) and the individual digits (S1, S2, S3, S4, S5). The network's accuracy is about 95.6%, which is very high. Even solving this problem was not enough: Google has millions of house-number photos, so it is impossible for a human to search through them by hand, and the task is time-consuming. This matters especially in countries such as Japan and South Korea, where houses are not numbered in order along a street. Google solved the problem by automating the whole process with AI. The Google Maps we now use to find addresses thus has a long history behind it.

Another notable project of Goodfellow's is the maxout network. He defined a simple new model, called maxout, designed both to facilitate optimization by dropout and to improve the accuracy of dropout's fast approximate model-averaging technique. Before going further, let's discuss dropout. Dropout is the process of randomly ignoring certain units in the input and hidden layers of a neural network during training; its primary purpose is to prevent overfitting. You might be surprised to know that dropout roughly doubles the number of iterations required to converge, although the training time for each epoch is shorter. Together, dropout and maxout demonstrated state-of-the-art classification performance. As its name suggests, a maxout unit outputs the maximum of several linear functions of its input, while dropout randomly discards units to regularize the network. Goodfellow has done remarkable work in the field of machine learning, and we hope he will continue to do so.
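The maxout and dropout ideas above can be sketched in a few lines. This is a minimal NumPy illustration under simplifying assumptions (a single layer, inverted dropout, illustrative shapes), not the architecture from the maxout paper: each maxout output unit takes the maximum over k affine "pieces", and dropout zeroes units at random during training.

```python
import numpy as np

rng = np.random.default_rng(42)

def maxout(x, W, b):
    # Maxout activation: each output unit is the max over k affine "pieces".
    # Shapes: x (n, d_in), W (k, d_in, d_out), b (k, d_out) -> (n, d_out).
    z = np.einsum('nd,kdo->nko', x, W) + b
    return z.max(axis=1)

def dropout(a, p, rng):
    # Inverted dropout: zero each unit with probability p and scale the
    # survivors by 1/(1-p) so the expected activation is unchanged.
    mask = rng.random(a.shape) >= p
    return a * mask / (1.0 - p)

x = rng.normal(size=(8, 5))           # a batch of 8 inputs with 5 features
W = rng.normal(size=(3, 5, 4))        # k=3 pieces, 4 output units
b = rng.normal(size=(3, 4))
h = maxout(x, W, b)                   # shape (8, 4)
h_train = dropout(h, p=0.5, rng=rng)  # applied only during training
print(h.shape, h_train.shape)
```

Because a maxout unit is piecewise linear, it pairs well with dropout's approximate model averaging, which is the combination the maxout paper exploits.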