Artificial Intelligence and Machine Learning Blogs
The Thirty-Fourth annual meeting of AAAI (AAAI-20) was held in New York. Both the conference committee and the participants demonstrated great resilience: more than a thousand AI researchers from academia and industry participated, and the committee live-streamed the major remarks online and still displayed the posters for those who were not able to travel to the venue.

Content was delivered in the form of Tutorials, Workshops, Technical Sessions, Talks, and Events.

A broad range of topics was covered:

  • Natural Language Processing (NLP): Entity Recognition and Linking, Machine Translation, etc.

  • Computer Vision: 3D, Synthesis and Generation, Object Detection, Vision + Language, Video, etc.

  • Machine Learning theory, including Interpretation, Online Learning, Neural Net Theory (Models and Algorithms), Supervised Learning, Unsupervised and Semi-supervised Learning, Reinforcement Learning, Multiagent Learning, Causal Learning and Bayes Nets, and Ethics (Fairness and Privacy), etc.

  • Optimization: Constraint Satisfaction, Game Theory, etc.

  • Robotics, Planning

  • Applications, including Customer Support and Marketing, Web Search, Ranking and Recommendation, Trade and Finance, Medical Imaging and Health, Science, etc.


Some remarks:

  1. ACM Turing Award winners' talks:

  • Yann LeCun: Self-supervised Learning

Self-supervised learning is one of the most promising ways to get machines to learn massive amounts of background knowledge about how the world works through observation, in a task-independent way, much as animals and humans do.
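To make the idea concrete (this is my own toy illustration, not code from the talk), the snippet below sketches a self-supervised pretext task: part of each unlabeled sample is masked out and a model learns to reconstruct it from the visible part, so the training targets come from the data itself. The toy data and the simple least-squares "model" are invented purely for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled "observations": the second half of each sample is a (noisy) linear
# function of the first half, so there is structure to discover without labels.
X = rng.normal(size=(1000, 8))
X[:, 4:] = X[:, :4] @ rng.normal(size=(4, 4)) + 0.1 * rng.normal(size=(1000, 4))

# Self-supervised pretext task: mask the second half and learn to reconstruct it
# from the visible half. The targets come from the data itself, not from labels.
visible, masked = X[:, :4], X[:, 4:]
W, *_ = np.linalg.lstsq(visible, masked, rcond=None)  # deliberately simple "model"

reconstruction = visible @ W
print("masked-part reconstruction MSE:", float(np.mean((reconstruction - masked) ** 2)))
```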

  • Yoshua Bengio: Deep Learning for AI

"There is no completely general intelligence — these are always some inductive bais and priors."

He noted that, Machine learning bypasses this problem by allowing the computer to acquire that knowledge from data, observations and interactions with an environment. Neural networks and deep learning are machine learning methods inspired by the brain in which information is not represented by symbolic statements but instead where oncepts have distributed representations, patterns of activations of features which can overlap across concepts, making it possible to quickly generalize to new concepts. More recently, deep learning has gone beyond its traditional realm of pattern recognition over vectors or images and expanded into many selfsupervised methods and generative models able to capture complex multimodal distributions, into models with attention which can process graphs and sets, leading to breakthroughs in speech recognition and synthesis, computer vision and machine translation.
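Since the summary mentions attention-based models that can process graphs and sets, here is a minimal NumPy sketch of scaled dot-product attention (the standard textbook formulation, not code from the talk). Because each output is just a weighted sum over all keys, the operation applies naturally to sets; the function name and toy data are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_queries, d), K: (n_keys, d), V: (n_keys, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # pairwise query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted sum of the values

# Self-attention over a small "set" of 5 elements with 8-dimensional features.
x = np.random.default_rng(1).normal(size=(5, 8))
print(scaled_dot_product_attention(x, x, x).shape)   # (5, 8)
```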

  • Geoffrey Hinton: Stacked Capsule Autoencoders

Capsule Networks were first introduced by Hinton at NIPS 2017. The idea is to represent the segments (parts) of a given image using small groups of neurons, and then to represent and identify the whole image from these learned sub-representations.
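For intuition, below is a minimal NumPy sketch of the routing-by-agreement mechanism from that original NIPS 2017 capsule paper (not the stacked capsule autoencoder presented in this talk): lower-level "part" capsules cast vector votes for higher-level "whole" capsules, and the coupling coefficients are iteratively nudged toward the votes they agree with. Shapes and names are illustrative.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity: keeps the vector's direction, maps its length into [0, 1).
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def routing_by_agreement(votes, n_iters=3):
    # votes: predictions from lower-level capsules for each higher-level capsule,
    # shape (n_lower, n_upper, dim_upper).
    logits = np.zeros(votes.shape[:2])                      # routing logits b_ij
    for _ in range(n_iters):
        coupling = np.exp(logits)
        coupling /= coupling.sum(axis=1, keepdims=True)     # softmax over upper capsules
        s = (coupling[..., None] * votes).sum(axis=0)       # weighted sum of the votes
        v = squash(s)                                       # upper-level capsule outputs
        logits += (votes * v[None, :, :]).sum(axis=-1)      # reinforce agreeing routes
    return v

# Toy example: 6 part capsules voting for 3 object capsules of dimension 4.
votes = np.random.default_rng(2).normal(size=(6, 3, 4))
print(routing_by_agreement(votes).shape)                    # (3, 4)
```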

  2. Some emerging areas (e.g. Time Series, ML in Games) with large business value have drawn researchers' attention.


To conclude, AAAI 2020 was an unparalleled learning experience, a great chance to hear from prestigious researchers in the field, and an opportunity to witness and celebrate the achievements made so far in this area.


Acknowledgment:

  • Special gratitude for funding: SAP DevX program


References:

  • Live-streamed talks from AAAI-20: link

  • AAAI 2020 accepted papers: link