One of my favorite resources for learning the mathematics behind deep learning is “Math for Deep Learning” by Ronald T. Kneusel from No Starch Press. If you’re interested in quickly getting up to speed with how deep learning algorithms work at a basic level, then this is the book for you.
Video Highlights: Yann LeCun and Andrew Ng: “AI Doomers” and Why the 6-month AI Pause is a Bad Idea
In this Video Highlights feature, two respected industry luminaries, Andrew Ng and Yann LeCun, discuss the proposal of a six-month moratorium on generative AI. The discussion offers reasoned perspectives on how generative AI has turned the world on its head.
Video Highlights: Humanizing Digital Banking with Generative AI
In the video presentation below, our friends over at DeepBrain explore how video synthesis technology can be used to bring a human touch to digital banking.
NVIDIA Brings Generative AI to World’s Enterprises With Cloud Services for Creating Large Language and Visual Models
To accelerate enterprise adoption of generative AI, NVIDIA announced a set of cloud services that enable businesses to build, refine and operate custom large language models and generative AI models that are trained with their own proprietary data and created for their unique domain-specific tasks.
Video Highlights: GPT-4 Developer Livestream
Here is Greg Brockman, President and Co-Founder of OpenAI, in a March 14, 2023 developer demo showcasing GPT-4 and some of its capabilities and limitations. Included are a number of very compelling new use cases that go beyond the previous GPT-3.5 version.
Don’t overlook independence in Responsible AI
In this contributed article, Dr. Stuart Battersby, Chief Technology Officer of Chatterbox Labs, aims to raise awareness of a key issue in the field of Responsible AI (also known as Ethical AI or Trustworthy AI): the issue of independence.
Lightning AI Releases PyTorch Lightning 2.0 and a New Open Source Library for Lightweight Scaling of Machine Learning Models
Lightning AI, the company accelerating the development of an AI-powered world, today announced the general availability of PyTorch Lightning 2.0, the company’s flagship open source AI framework used by more than 10,000 organizations to quickly and cost-efficiently train and scale machine learning models. The new release introduces a stable API, offers a host of powerful features with a smaller footprint, and is easier to read and debug.
Data Science Bows Before Prompt Engineering and Few Shot Learning
In this contributed article, editorial consultant Jelani Harper takes a new look at the GPT phenomenon by exploring how prompt engineering (stores, databases) coupled with few-shot learning can constitute a significant adjunct to traditional data science.
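As a rough illustration of the idea, few-shot learning amounts to prepending a handful of labeled examples to a prompt before the new query, letting the model infer the task from the demonstrations. The function name and example data below are hypothetical, for intuition only, and are not drawn from the article:

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: labeled demonstrations first, then the new query."""
    parts = [f"Review: {text}\nSentiment: {label}" for text, label in examples]
    # The final entry leaves the label blank for the model to complete
    parts.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(parts)

# A few labeled demonstrations (hypothetical data)
examples = [
    ("The product arrived broken.", "negative"),
    ("Works perfectly, great value.", "positive"),
]

prompt = build_few_shot_prompt(examples, "Shipping was fast and painless.")
print(prompt)
```

In practice the assembled string would be sent to a large language model; the point is that no weights change, so the "training" lives entirely in the prompt.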
Infographic: Is AI the Next Gold Rush?
Our friends over at writerbuddy.ai analyzed over 10,000 AI companies and their funding data between 2015 and 2023. The data was collected from CrunchBase, NetBase Quid, S&P Capital IQ, and NFX. Corporate AI investment has risen consistently to the tune of billions.
Research Highlights: SparseGPT: Prune LLMs Accurately in One-Shot
A new research paper shows that large-scale generative pretrained transformer (GPT) family models can be pruned to at least 50% sparsity in one-shot, without any retraining, at minimal loss of accuracy. This is achieved via a new pruning method called SparseGPT, specifically designed to work efficiently and accurately on massive GPT-family models.
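SparseGPT itself relies on an efficient approximate second-order (Hessian-based) weight-update procedure; the snippet below only sketches the much simpler one-shot magnitude-pruning baseline it improves upon, to give a feel for what 50% unstructured sparsity means for a weight matrix:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights (one-shot, no retraining)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold             # keep only larger-magnitude weights
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8)).astype(np.float32)
pruned, mask = magnitude_prune(w, sparsity=0.5)
print(f"sparsity: {1 - mask.mean():.2f}")
```

At GPT scale, naive magnitude pruning at 50% sparsity degrades accuracy noticeably; the paper's contribution is making one-shot pruning accurate at that scale by solving a local reconstruction problem per layer.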











