Coredge IO unveils affordable AI inference platform at IMC '24

Coredge IO unveils an affordable AI Inference-as-a-Service platform powered by Qualcomm at IMC 2024, offering scalable AI solutions for businesses of all sizes.

Punam Singh

At the India Mobile Congress (IMC) 2024, Coredge IO introduced a new AI Inference-as-a-Service platform aimed at making advanced AI accessible and affordable for businesses of all sizes. Powered by Qualcomm's Cloud AI 100 Ultra accelerator, the platform offers a cost-effective way to run AI models at scale, addressing the high costs traditionally associated with AI inference.

Coredge IO, recently acquired by Sirius Digitech, a joint venture between Adani Group and Sirius International Holding, is committed to developing cutting-edge, sovereign cloud and AI technologies for government and enterprise clients. The AI Inference-as-a-Service platform is designed to offer users a flexible, pay-as-you-go model, ensuring they only pay for the computational resources used during AI inference, significantly lowering the total cost of ownership (TCO).

Qualcomm, a key technology partner in the initiative, sees its collaboration with Coredge IO as an important step towards advancing India's AI capabilities. Savi Soin, Senior Vice President at Qualcomm India, noted that the platform would create new opportunities for developers to build advanced cloud applications at a fraction of traditional costs. The AI 100 Ultra accelerator, known for its power efficiency and its ability to handle large language models and generative AI workloads, lets businesses deploy AI models without incurring prohibitive expenses.

Coredge IO’s AI Inference-as-a-Service platform is designed to cater to a diverse audience, from startups and small businesses to large enterprises and independent AI developers. The platform’s scalability ensures that businesses can deploy AI solutions efficiently, even if they lack the resources to maintain dedicated AI infrastructure. It also supports regional language models, aligning with India’s Bhashini initiative, which focuses on local language AI development.

Beyond offering a cost-effective AI solution, Coredge IO plans to expand the platform with future integration into a decentralised Web 3.0 architecture. This transition would enable resource sharing across a secure, distributed network, making AI inference even more accessible.

At IMC 2024, Coredge demonstrated the platform’s capabilities to industry leaders, government officials, and AI innovators, showing how it could drive the future of AI in India. With its focus on affordability, scalability, and regional language support, the platform is set to make a significant impact on AI adoption across various industries. 
