IBM Granite
| Developer(s) | IBM Research[1] |
| --- | --- |
| Initial release | November 7, 2023 |
| Platform | IBM Watsonx (initially), GitHub, Hugging Face, RHEL AI |
| Type | Foundation models |
| License | Proprietary; code models: open source (Apache 2.0)[2] |
IBM Granite is a series of decoder-only AI foundation models created by IBM. It was announced on September 7, 2023,[3][4] and an initial paper was published four days later.[5] Initially intended for use in IBM's cloud-based data and generative AI platform Watsonx alongside other models,[6] IBM later opened the source code of some of its code models.[7] Granite models are trained on datasets curated from the Internet, academic publications, code datasets, and legal and financial documents.[8][9][1]
Foundation models
A foundation model is an AI model trained on broad data at scale such that it can be adapted to a wide range of downstream tasks.[10]
Granite's first foundation models were Granite.13b.instruct and Granite.13b.chat. The "13b" in their names refers to their 13 billion parameters, fewer than in most of the larger models of the time. Later models range from 3 to 34 billion parameters.[3][11]
On May 6, 2024, IBM released the source code of four variations of its Granite Code Models under Apache 2.0, a permissive open-source license that allows free use, modification, and redistribution of the software, and published them on Hugging Face for public use.[12][13] According to IBM's own benchmarks, the 8-billion-parameter Granite code model outperforms Llama 3 on several coding-related tasks within a similar parameter range.[14][15]
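Because the open-sourced code models are published on Hugging Face, they can be loaded with the standard Hugging Face transformers library. The following is a minimal sketch, not an official IBM example; the repository identifier ibm-granite/granite-8b-code-instruct and the generation settings are assumptions, and the exact model names and recommended parameters are those listed on the ibm-granite organization's Hugging Face pages.

```python
# Minimal sketch: loading an open-sourced Granite code model from Hugging Face.
# The repository id below is an assumption; check the ibm-granite organization
# on Hugging Face for the exact published model names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-8b-code-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a completion; max_new_tokens here is an illustrative default.
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```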
See also
- Mistral AI, a company that also provides open-source models
- GPT
- LLaMA
- Cyc
- Gemini
References
- ^ a b McDowell, Steve. "IBM's New Granite Foundation Models Enable Safe Enterprise AI". Forbes.
- ^ ibm-granite/granite-code-models, IBM Granite, 2024-05-08, retrieved 2024-05-08
- ^ a b Nirmal, Dinesh (September 7, 2023). "Building AI for business: IBM's Granite foundation models". IBM.
- ^ "IBM debuts Granite series of hardware-efficient language models". September 7, 2023.
- ^ "Granite Foundation Models" (PDF). IBM. 2023-11-30.
- ^ Fritts, Harold (2024-04-22). "IBM Adds Meta Llama 3 To watsonx, Expands AI Offerings". StorageReview.com. Retrieved 2024-05-08.
- ^ Jindal, Siddharth (2024-05-07). "IBM Releases Open-Source Granite Code Models, Outperforms Llama 3". Analytics India Magazine. Retrieved 2024-05-08.
- ^ Azhar, Ali (2024-04-08). "IBM Patents a Faster Method to Train LLMs for Enterprises". Datanami. Retrieved 2024-05-08.
- ^ Wiggers, Kyle (2023-09-07). "IBM rolls out new generative AI features and models". TechCrunch. Retrieved 2024-05-08.
- ^ "Introducing the Center for Research on Foundation Models (CRFM)". Stanford HAI. 18 August 2021.
- ^ Pawar, Sahil (2023-09-11). "IBM Introduces Granite Series LLM Models for Watsonx Platform". Analytics Drift. Retrieved 2024-05-09.
- ^ Nine, Adrianna (May 7, 2024). "IBM Makes Granite AI Models Open-Source Under New InstructLab Platform". ExtremeTech.
- ^ "IBM open-sources its Granite AI models - and they mean business". ZDNET. Retrieved 2024-05-21.
- ^ Jindal, Siddharth (2024-05-07). "IBM Releases Open-Source Granite Code Models, Outperforms Llama 3". Analytics India Magazine. Retrieved 2024-05-09.
- ^ Synced (2024-05-13). "IBM's Granite Code: Powering Enterprise Software Development with AI Precision | Synced". syncedreview.com. Retrieved 2024-05-21.