This week in AI #1: 5 articles that will make you smarter
1- Ever wondered how to title your post?
You’re not the only one! Your title alone can be the difference between a success and a miss; plus it’s sometimes pretty hard to hit the nail on the head without sounding like clickbait. Fortunately, the team at Buzzsumo just published an extensive study on what works and what doesn’t, so you can stop spending more time on the title than on the actual post! Head over to their page for more information on the research and practical advice.
2- Are GPUs really the solution to Deep Learning?
Well, whether you’re on a budget or not, it would appear that might not be the case anymore. According to a recent benchmark, training on CPUs is slower but also way more cost-effective! And in a world where the cloud lets you spin up thousands of them with little to no hassle, this could make a big difference in your wallet at the end of the month.
3- Deep Learning on every device?
Okay fine, but Leo, what if I don’t have access to the internet or a nearby server? It’s no news that AI, and specifically Deep Learning, requires massive amounts of computation, not only at training time but also when it’s being used in production. Alas, sometimes all you have access to is a Raspberry Pi or a smaller device, which can pretty much close the door on any real application. But good news! Microsoft is trying really hard to bridge the gap and make Deep Learning accessible on all devices!
4- DeepMind going after Element.ai?
Shortly after Element.ai, the Montreal AI platform company co-founded by Yoshua Bengio, raised $102M, DeepMind announced that it is expanding to Canada too! Are we witnessing the birth of a new rivalry? East coast or west coast, time to choose your team!
5- Meow Generator
Speaking of Canada, we will end this week in AI with a cat generator coming straight from the Maple Leaf Country! I can hear you from over here: sure, it’s not the first time a cat generator has been built, but what’s interesting about this one is that it compares different GAN architectures, and specifically it’s the first real use of the self-normalizing linear unit (SELU) that we’ve seen. Head over to Alexia’s website to read all about it!
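If you’re curious what that activation looks like, here is a minimal sketch of SELU (introduced in the “Self-Normalizing Neural Networks” paper, where it stands for scaled exponential linear unit), using the fixed constants published there:

```python
import math

# Fixed constants from the Self-Normalizing Neural Networks paper.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x: float) -> float:
    """SELU activation: scaled identity for positive inputs,
    scaled exponential for negative inputs."""
    if x > 0:
        return SCALE * x
    return SCALE * ALPHA * (math.exp(x) - 1.0)
```

The point of the specific constants is that, under the right weight initialization, activations converge toward zero mean and unit variance layer after layer, giving a self-normalizing effect without batch normalization.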
This post is the first of its kind but definitely not the last; stay tuned if you want to read our future weekly digests!
🔔 Shameless plug alert: Want to exploit the latest advances in Deep Learning in a production environment without jumping through all the hoops and spending months implementing everything? Take a look at our solution: focus on your expertise and leave the rest to us!