Selecting AI models is as much a technical decision as it is a strategic one. But choosing between open, closed or hybrid models each comes with trade-offs.
Speaking at this year's VB Transform, model architecture experts from General Motors, Zoom and IBM discussed how their companies and customers approach AI model selection.
Barak Turovsky, who in March became GM's first chief AI officer, said there's a lot of noise with every new model release and every time the leaderboard changes. Long before leaderboards were a mainstream debate, Turovsky helped launch the first large language model (LLM), and he recalled how open-sourcing AI model weights and training data led to major breakthroughs.
"That was frankly probably one of the biggest breakthroughs that helped OpenAI and others to start launching," Turovsky said. "So it's actually a funny anecdote: Open source actually helped create something that went closed, and now maybe is back to being open."
The factors behind these decisions vary, and include cost, performance, trust and safety. Turovsky said enterprises sometimes prefer a mixed strategy: using an open model for internal use and a closed model for production and customer-facing work, or vice versa.
IBM's AI strategy
Armand Ruiz, IBM's VP of AI platform, said IBM initially built its platform around its own LLMs, but then realized that wouldn't be enough, especially as more powerful models arrived on the market. The company expanded to offer integrations with platforms like Hugging Face so customers could pick any open-source model. (The company recently debuted a new model gateway that gives enterprises an API for switching between LLMs.)
More enterprises are choosing to buy models from multiple vendors. When Andreessen Horowitz surveyed 100 CIOs, 37% of respondents said they were using five or more models; last year, only 29% were using that many.
Choice is important, but sometimes too much choice creates confusion, said Ruiz. To help customers with their approach, IBM doesn't worry too much about which LLM they're using during the proof-of-concept or pilot phase; the main goal is feasibility. Only later does the company look at whether to distill a model or customize one based on a customer's needs.
"First we try to simplify all that analysis paralysis with all those options and focus on the use case," Ruiz said. "Then we figure out what is the best path for production."
How Zoom approaches AI
Zoom's customers can choose between two configurations for its AI Companion, said Zoom CTO Xuedong Huang. One involves federating the company's own LLM with other, larger foundation models. The other lets customers concerned about juggling too many models use just Zoom's model. (The company also recently partnered with Google Cloud to adopt an agent-to-agent protocol for AI Companion for enterprise workflows.)
The company built its own small language model (SLM) without using customer data, Huang said. At 2 billion parameters, the model is indeed very small, but it can still outperform other industry-specific models. The SLM works best on complex tasks when running alongside a larger model.
"This is really the power of a hybrid approach," Huang said. "Our philosophy is very simple. Our company is leading the way, very much like Mickey Mouse and the elephant dancing together. The small model will perform a very specific task. We are not saying a small model will be good enough…The Mickey Mouse and the elephant will be working together as one team."