Here’s a list of 5 articles that caught our eye in the AI community this week. Check it out!
1 – Ever wondered how to title your post?
You’re not the only one! Your title alone can be the difference between a hit and a miss, and it’s hard to hit the nail on the head without sounding like clickbait. Fortunately, the team at Buzzsumo just published an extensive study on what works and what doesn’t, so you can stop spending more time on the title than on the post itself. Head over to their page for the research and some practical advice!
2 – Are GPUs really the solution to deep learning?
Well, it would appear that, budget or no budget, that might not be the case anymore. According to a recent benchmark, training on CPUs is slower but also far more cost-effective! And thanks to the cloud, where you can spin up thousands of them with little to no hassle, this could make a big difference at the end of the month.
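The trade-off boils down to simple arithmetic: a CPU run can take several times longer and still come out cheaper if the hourly rate is low enough. Here is a toy comparison — the prices and training times below are made up for illustration, not figures from the benchmark:

```python
# Hypothetical on-demand cloud prices (USD/hour) and training times -- assumed
# numbers for illustration only, not results from the benchmark above.
GPU_PRICE_PER_HOUR = 0.90   # single-GPU instance (assumed)
CPU_PRICE_PER_HOUR = 0.10   # many-core CPU instance (assumed)

gpu_hours = 10   # the GPU finishes faster...
cpu_hours = 60   # ...the CPU takes 6x longer

gpu_cost = GPU_PRICE_PER_HOUR * gpu_hours   # total bill for the GPU run
cpu_cost = CPU_PRICE_PER_HOUR * cpu_hours   # total bill for the CPU run

print(f"GPU: ${gpu_cost:.2f}  CPU: ${cpu_cost:.2f}")
```

With these made-up rates the CPU run costs less despite being six times slower — which is exactly the kind of accounting the benchmark invites you to do with your own cloud prices.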
3 – Deep learning in every device?
Okay fine, but Léo, what if I don’t have access to the internet or a nearby server? It’s no news that AI, and deep learning in particular, requires a massive amount of computation, not only at training time but also in production. Alas, sometimes all you have is a Raspberry Pi or something even smaller, which can pretty much close the door on any real application. But good news! Microsoft is working hard to bridge the gap and bring deep learning to all devices! Its researchers are building systems that can run machine learning algorithms on microcontrollers as small as the one pictured.
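To get a feel for why this is even possible, consider weight quantization — one common way to shrink a network so it fits in a microcontroller’s tiny memory. This is an illustrative sketch, not necessarily the technique Microsoft’s researchers use:

```python
import numpy as np

def quantize(weights: np.ndarray):
    """Linear 8-bit quantization: map float32 weights to int8 plus a scale.

    Illustrative only -- a real deployment pipeline would also handle
    zero-points, per-channel scales, activations, etc.
    """
    scale = float(np.abs(weights).max()) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.0, 0.25, 0.0], dtype=np.float32)
q, scale = quantize(w)
w_hat = dequantize(q, scale)  # close to w, but stored in 4x fewer bytes
```

Storing int8 instead of float32 cuts the model’s memory footprint by 4x, at the cost of a small rounding error per weight — often an acceptable trade on devices with kilobytes, not gigabytes, of RAM.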
4 – DeepMind going after Element.ai?
Shortly after Element.ai, the Montreal startup co-founded by Yoshua Bengio, raised $102M to develop its AI platform, DeepMind announced that it is expanding to Canada too! Are we witnessing the birth of a new rivalry? East Coast or West Coast: time to choose your team!
5 – Meow Generator
Speaking of Canada, we’ll end This Week in AI with a cat generator coming straight from the maple leaf country! I can hear you from over here: sure, it’s not the first time a cat generator has been built. What’s interesting about this one is that it compares different GAN architectures and, more precisely, it’s the first real use we’ve seen of the scaled exponential linear unit (SELU) from the recent “Self-Normalizing Neural Networks” paper. Head over to Alexia’s website to read all about it!
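For the curious, SELU itself is a one-liner. It scales an exponential linear unit by carefully chosen constants so that activations tend to keep zero mean and unit variance as they flow through the network:

```python
import math

# Constants from Klambauer et al., "Self-Normalizing Neural Networks" (2017)
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x: float) -> float:
    """Scaled exponential linear unit.

    lambda * x                       if x > 0
    lambda * alpha * (exp(x) - 1)    otherwise
    """
    return LAMBDA * x if x > 0 else LAMBDA * ALPHA * (math.exp(x) - 1.0)
```

The negative branch saturates at -LAMBDA * ALPHA (about -1.758), which is what pushes activations back toward zero mean instead of letting them drift.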
This post is the first of its kind, but definitely not the last. Stay tuned for our future weekly digests!
🔔 Shameless plug alert! Want to put the latest advances in deep learning to work in production without jumping through all the hoops and spending months implementing everything yourself? Take a look at our solution: focus on your expertise and leave the rest to us!