Speaker Interview with Mouli Narayanan, Zeblok Computational
1. Why is Intelligent Infrastructure such an exciting market today?
Gartner predicts that 75% of enterprise-generated data will be created and processed outside traditional data centers by 2025. This changes everything. MIT Technology Review has explored the industry-wide revolution that the ability to compute everywhere will bring.
To process data in real time, compute capability must sit close to where the data is generated: retail locations, manufacturing shop floors, oil rigs, and so on. Ubiquitous computing at Edge locations is a precursor to intelligent infrastructure; curating next-generation digital assets, i.e., Ai inferencing, is what earns infrastructure the "intelligent" prefix.
To deliver intelligent infrastructure, providers must address the continuum of learning, inferencing, and relearning. This creates the need for ML DevOps spanning public cloud to Edge, because computational demands will keep model training on public clouds or large on-prem deployments for the foreseeable future.
Intelligent infrastructure must bridge three gaps to meet the needs of customers and the collective Ai ecosystem.
- Model-to-API gap – closing it will allow customers to treat Ai assets like content.
- Price-performance gap – closing it will lower the total cost of ownership by running Ai inferencing on the appropriate system/chip architecture.
- Data hauling gap – closing it will enable staging metadata on Edge infrastructure and throttling the volume of data backhauled to cloud environments.
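The data hauling gap can be pictured as Edge-side filtering: process raw inference events locally and backhaul only a compact metadata summary instead of raw data. A minimal sketch in Python, assuming a hypothetical event format (`label`, `confidence`) and an illustrative threshold; none of this is part of Ai-MicroCloud® itself:

```python
from collections import Counter

def stage_edge_metadata(events, min_confidence=0.8):
    """Filter raw inference events at the Edge and build a compact
    metadata summary for backhaul, instead of shipping raw data."""
    kept = [e for e in events if e["confidence"] >= min_confidence]
    return {
        "event_count": len(kept),
        "labels": dict(Counter(e["label"] for e in kept)),
        "dropped": len(events) - len(kept),  # volume throttled at the Edge
    }

# Example: three raw events reduced to one small summary record.
raw = [
    {"label": "person", "confidence": 0.95},
    {"label": "person", "confidence": 0.91},
    {"label": "cart",   "confidence": 0.40},  # below threshold, not backhauled
]
print(stage_edge_metadata(raw))
# → {'event_count': 2, 'labels': {'person': 2}, 'dropped': 1}
```

The design choice is simply that the summary record is orders of magnitude smaller than the raw stream, which is what throttling backhaul volume amounts to in practice.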
At Zeblok Computational, we are focused on these priorities. Our flagship product, Ai-MicroCloud®, addresses these gaps with a Cloud-to-Edge ML DevOps platform that lets our customers mix and match Ai ISVs and vendors at scale to deliver Edge Ai applications, supporting the full deployment lifecycle.
So intelligent infrastructure is exciting because it is the culmination of cloud and Ai technologies at the heart of a significant paradigm shift: bringing compute to data.
2. As a proponent of the multi-tenancy approach to deploying Edge computing, why is it that Zeblok sees this as so important?
After years of product investments in public cloud environments, market participants (vendors, customers, channel partners) are familiar with multi-tenancy. Economies of scale have been honed on these deployments, driving costs down. Edge Cloud is relatively nascent, and we anticipate early investments in Edge computing to be more dedicated, driven by early enterprise adopters. But as with any other technology, lowering the cost of ownership per insight is an important KPI for generating customer value. Once secure multi-tenant capability is available, un-utilized capacity at Edge locations offers monetization prospects for those who made the early investments. Furthermore, the shareability that multi-tenancy brings to Edge networks on intelligent infrastructure fosters collaboration and product innovation.
3. Can you give some key application areas you expect to be enabled by the multi-tenancy Edge platforms?
I do not believe initial Edge computing success and ROI are tied to multi-tenancy. But specifically in the context of multi-tenancy, Smart Kiosk applications in market segments such as stadiums, retail, and smart city deployments are a good example. Computer vision-driven, low-latency inferencing served as an API from Edge MECs is another important application segment. Edge data center operators also require multi-tenancy to maximize their returns, and multi-tenancy can add a new revenue stream to government investments in bridging the digital divide.
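"Inferencing as an API" in an Edge MEC boils down to exposing a model behind an HTTP endpoint close to the data source. A minimal, hypothetical sketch using only the Python standard library; `infer` is a stub standing in for a real computer-vision model, and the endpoint shape is illustrative rather than any particular vendor's API:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def infer(payload: bytes) -> dict:
    # Stub for a real computer-vision model running on the Edge MEC;
    # it only reports payload size so the sketch stays self-contained.
    return {"detections": [], "bytes_received": len(payload)}

class InferenceHandler(BaseHTTPRequestHandler):
    """Accepts a POSTed image and returns the inference result as JSON."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        result = infer(self.rfile.read(length))
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve on the Edge node:
#   HTTPServer(("0.0.0.0", 8080), InferenceHandler).serve_forever()
```

Because the round trip stays inside the MEC, the latency budget is dominated by the model itself rather than by backhaul to a distant cloud region.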
4. What do you see as the key challenges in this market where the industry needs to collaborate for the full potential of the opportunity to be realized?
There are two broad challenges. To use deliberately clichéd analogies, I will call these the "chicken-and-egg" and "all-or-nothing" challenges.
Edge data center (MEC) operators need cloud service providers (CSPs) on their Edge infrastructure. CSPs need use cases to attract enterprise customers. And because of the bespoke engineering involved, enterprises face additional barriers to demonstrating monetizable use cases. So who makes the initial investment? That is the "chicken-and-egg" challenge.
Solutions like EV charging require large-scale deployments running into thousands of Edge locations. Autonomous vehicles that need intelligent Edge computing must likewise run at scale. Where are the smaller deployments? For solution providers, this is the "all-or-nothing" challenge, and it prevents incremental growth and maturation of their offerings.
Financial participants have become more creative in underwriting Edge deployments, but they will only do so if they can be assured of landing on the better side of the "all-or-nothing" challenge, rather than backing trial implementations that merely have the potential to become large-scale deployments.
To realize the full potential, market participants across the ecosystem must focus on smaller investments in fully functioning EdgeLabs, shifting the focus away from "proof of concepts" toward a subscription-driven, "try-and-buy" business model. Zeblok Computational, in concert with our partners, provides products and services to overcome these challenges.
5. What are you looking forward to at the Intelligent Infrastructure conference in Austin?
Zeblok Computational provides a SaaS middleware product that is essentially a software-defined Ai ecosystem. Our network of partners, spanning Ai ISVs, OEM manufacturers, chipmakers, Edge data center operators, and cloud governance ISVs, makes us uniquely comprehensive in helping companies turn their aspirations at the Edge into reality.
By 2035 there will be one trillion Edge devices, requiring many millions of MECs. Zeblok’s Ai-MicroCloud® can bring intelligence to your infrastructure today, enabling you to seize your share of that future reality.