
Mamba is Here to Mark the End of Transformers

Researchers Albert Gu and Tri Dao from Carnegie Mellon and Together AI have introduced a groundbreaking model named Mamba, challenging the prevailing dominance of Transformer-based architectures in deep learning. 

Their research unveils Mamba as a state-space model (SSM) that demonstrates superior performance across various modalities, including language, audio, and genomics. For example, in language modelling the Mamba-3B model outperformed Transformer-based models of the same size and matched Transformers twice its size, in both pretraining and downstream evaluation.

The code is available in the project's GitHub repository.

Mamba is presented as a state-of-the-art model with linear-time scaling in sequence length, very long context, and high efficiency, outperforming Transformers on the tasks it has been tested on.

The model is built on the foundation of structured state space models (SSMs), showing promising results on information-dense data and particularly excelling in language modelling, where previous subquadratic models have fallen short of Transformers.
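At its core, an SSM processes a sequence with a linear recurrence. A minimal sketch of that computation, using illustrative scalar parameters rather than Mamba's learned, discretised matrices:

```python
def ssm_scan(xs, A=0.9, B=1.0, C=0.5):
    """Linear SSM recurrence: h_t = A*h_{t-1} + B*x_t, output y_t = C*h_t.

    A, B, C here are illustrative scalars; in a real SSM they are
    learned matrices applied per channel.
    """
    h, ys = 0.0, []
    for x in xs:
        h = A * h + B * x   # state update: constant work per token
        ys.append(C * h)    # linear readout of the hidden state
    return ys
```

Because each step touches only a fixed-size state, the cost grows linearly with sequence length, in contrast to attention's quadratic cost.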

The researchers emphasise Mamba’s efficiency through its selective SSM layer, designed to address the computational inefficiency of Transformers on long sequences: it scales to sequence lengths of up to a million tokens, a regime where the quadratic cost of attention is a major limitation for Transformers.
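The selection idea can be illustrated by making the recurrence parameters functions of the current input, so the layer can choose per token how much to retain or forget. A hedged sketch; the gating functions below are illustrative stand-ins, not Mamba's learned projections:

```python
import math

def selective_scan(xs):
    """Toy selective recurrence: the step size depends on the input token.

    Illustrative only -- Mamba computes its (discretised) parameters
    from learned, input-dependent projections.
    """
    h, ys = 0.0, []
    for x in xs:
        delta = 1.0 / (1.0 + math.exp(-x))  # input-dependent step size
        a_bar = math.exp(-delta)            # larger delta -> forget more state
        b_bar = 1.0 - a_bar                 # ...and admit more of the input
        h = a_bar * h + b_bar * x
        ys.append(h)
    return ys
```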

I’m always excited by new attempts to dethrone transformers. We need more of these. Kudos to @tri_dao & @_albertgu for pushing on alternative sequence architectures for many years now. https://t.co/cf67Xa2PBS

— Jim Fan (@DrJimFan) December 4, 2023

Installation requires Linux, an NVIDIA GPU, PyTorch 1.12+, and CUDA 11.6+, along with a causal Conv1d layer package and the core Mamba package.
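Assuming the package names used by the repository (`causal-conv1d` and `mamba-ssm` on PyPI), installation would look roughly like:

```shell
# Environment setup -- requires Linux, an NVIDIA GPU, PyTorch 1.12+, CUDA 11.6+.
pip install causal-conv1d   # causal Conv1d layer used inside the Mamba block
pip install mamba-ssm       # the core Mamba package
```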

Mamba’s versatility is demonstrated through its integration into an end-to-end neural network: a complete language model whose model dimension and number of layers can be varied.

Pretrained Mamba models are provided at a range of parameter counts and layer depths, showing its adaptability to different tasks and data sizes. The researchers evaluate Mamba by running zero-shot evaluations with the lm-evaluation-harness library, comparing against models such as EleutherAI’s pythia-160m.
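As a sketch of such a zero-shot evaluation — assuming the lm-evaluation-harness CLI (`lm_eval`, v0.4+) and the Hugging Face name of the baseline model; the repository may instead use its own wrapper script around the harness:

```shell
# Hypothetical invocation: runs the pythia-160m baseline zero-shot on two tasks.
# The Mamba checkpoints would be evaluated through the repository's harness wrapper.
lm_eval --model hf \
    --model_args pretrained=EleutherAI/pythia-160m \
    --tasks lambada_openai,hellaswag \
    --device cuda:0 --batch_size 8
```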

The research identifies a key weakness in existing subquadratic-time architectures, attributing it to their inability to perform content-based reasoning. Mamba addresses this weakness by allowing selective propagation or forgetting of information along the sequence length dimension, demonstrating significant improvements over traditional models.

Although selectivity rules out the efficient convolutions used by earlier SSMs, Mamba employs a hardware-aware parallel algorithm in recurrent mode, yielding fast inference and linear scaling in sequence length.
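The reason a recurrence can still be parallelised is that steps of the form h_t = a_t·h_{t-1} + b_t compose associatively, so they can be combined in a tree (a parallel scan) rather than strictly left to right. A minimal sketch of that property — illustrative only; Mamba's hardware-aware kernel additionally fuses the scan with GPU memory-level optimisations:

```python
from functools import reduce

def combine(p, q):
    """Compose two recurrence steps (apply p, then q).

    Applying (a1, b1) then (a2, b2) to h gives
    a2*(a1*h + b1) + b2 = (a1*a2)*h + (a2*b1 + b2).
    """
    a1, b1 = p
    a2, b2 = q
    return (a1 * a2, a2 * b1 + b2)

def sequential(pairs, h0=0.0):
    """Reference: run the recurrence left to right."""
    h = h0
    for a, b in pairs:
        h = a * h + b
    return h

def scan_reduce(pairs, h0=0.0):
    """Same result via associative combination (tree-parallelisable)."""
    a, b = reduce(combine, pairs)
    return a * h0 + b
```

Because `combine` is associative, the fold can be evaluated in O(log n) parallel depth on hardware, which is what makes recurrent-mode SSMs fast in practice.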

Mamba emerges as a compelling contender challenging the Transformer paradigm, demonstrating superior performance in diverse modalities and promising advancements in the field of deep learning.

The post Mamba is Here to Mark the End of Transformers appeared first on Analytics India Magazine.
