🤖 Now on #KaggleModels! AI at Meta's largest openly available foundation model, Llama 3.1, excels in synthetic data generation and distillation with its 405B parameters, supporting 8 languages and a 128K context window. Also launching 8B and 70B models! 👉 Learn more: https://lnkd.in/ev72HPpp
About us
Kaggle provides cutting-edge data science, faster and better than most people ever thought possible. We have a proven track record of solving real-world problems across a diverse array of industries including pharmaceuticals, financial services, energy, information technology, and retail. Kaggle offers both public and private data science competitions and on-demand consulting by an elite global talent pool.
- Website
- http://www.kaggle.com
- Industry
- IT Services and IT Consulting
- Company size
- 11-50 employees
- Headquarters
- San Francisco, California
- Type
- Privately held
- Founded
- 2010
- Specialties
- open data, predictive modeling, machine learning, and data science
Locations
- Primary
188 King Street #502
San Francisco, California 94107, US
Updates
-
🤖 Now on #KaggleModels! Part of the Cohere and Cohere For AI family, Command R Plus is a multilingual (10 languages) model with advanced capabilities. This 104B parameter model includes retrieval-augmented generation (RAG) and multi-step tool use, and is optimized for reasoning, summarization, and question answering. 👉 Learn more: https://goo.gle/4bZYmGD
CohereForAI | Command R Plus | Kaggle
kaggle.com
-
🤖 AI at Meta's Llama 3.1 is now available on #KaggleModels! 👉 Learn more: https://lnkd.in/ev72HPpp
Introducing Meta Llama 3: the next generation of our state-of-the-art open source large language model — and the most capable openly available LLM to date. These next-generation models demonstrate SOTA performance on a wide range of industry benchmarks and offer new capabilities such as improved reasoning.

Details in the full announcement ➡️ https://go.fb.me/a24u0h
Download the models ➡️ https://go.fb.me/q8yhmh
Experience Llama 3 with Meta AI ➡️ https://meta.ai

Llama 3 8B & 70B deliver a major leap over Llama 2 and establish a new SOTA for models of their sizes. While we’re releasing these first two models today, we’re working to release even more for Llama 3 including multiple models with capabilities such as multimodality, multilinguality, longer context windows and more. Our largest models are over 400B parameters and while they’re still in active development, we’re very excited about how they’re trending.

Across the stack, we want to kickstart the next wave of innovation in AI. We believe these are the best open source models of their class, period — we can’t wait to see what you build and look forward to your feedback.
-
🤖 Now on #KaggleModels! HHEM is an open source model from Vectara that detects hallucinations in LLMs. Great for retrieval-augmented generation (RAG) applications and other contexts too. 👉 Learn more: https://lnkd.in/e2sbj5fv
Hallucination Evaluation Model (HHEM)
kaggle.com
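In a RAG setting, an HHEM-style check scores each (source passage, generated answer) pair for factual consistency and flags answers whose scores fall below a threshold. A minimal sketch of that surrounding logic, with a stand-in word-overlap scorer for illustration (the real HHEM model, its loading API, and the 0.5 threshold are assumptions, not part of the post):

```python
from typing import Callable, List, Tuple

# A scorer maps (premise, hypothesis) pairs to consistency scores in [0, 1];
# low scores suggest the hypothesis is not supported by the premise.
# In a real pipeline this would wrap the HHEM model; here it is injected.
Scorer = Callable[[List[Tuple[str, str]]], List[float]]

def flag_hallucinations(
    sources: List[str],
    answers: List[str],
    scorer: Scorer,
    threshold: float = 0.5,  # hypothetical cutoff, tune per application
) -> List[bool]:
    """Return True for each answer whose consistency score falls below threshold."""
    pairs = list(zip(sources, answers))
    scores = scorer(pairs)
    return [score < threshold for score in scores]

# Stand-in scorer for illustration only: fraction of answer words
# that also appear in the source passage.
def overlap_scorer(pairs):
    scores = []
    for premise, hypothesis in pairs:
        p = set(premise.lower().split())
        h = set(hypothesis.lower().split())
        scores.append(len(p & h) / max(len(h), 1))
    return scores

flags = flag_hallucinations(
    ["Paris is the capital of France."] * 2,
    ["Paris is the capital of France.",
     "Lyon hosts the national assembly of Mars."],
    overlap_scorer,
)
# → [False, True]: the second answer is poorly supported by its source
```

The design point is that the consistency model is pluggable: the same thresholding harness works whether the scorer is a toy heuristic or HHEM itself.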
-
Discover how to fine-tune Gemma-2-9b-it and compete in the LMSYS Chatbot Arena competition! Check out these insightful notebooks and elevate your AI skills. 👉 Training: https://lnkd.in/e5SFfFTS 👉 Inference: https://lnkd.in/ePWHUHdu 👉 Model: https://lnkd.in/esmzT6dw
-
📚 Check out this fantastic notebook by Daniel Han, the co-creator of Unsloth AI! Discover how to fine-tune Gemma-2-9b using Kaggle notebooks. 👉 Learn more: https://goo.gle/3VZn97S
Kaggle Gemma2 9b Unsloth notebook
kaggle.com
-
🤖 Now on #KaggleModels! AI at Meta's Llama 3 offers pretrained and fine-tuned generative text models ranging from 8B to 70B in size. Use for assistant-like chat or natural language generation tasks. 👉 Learn more: https://lnkd.in/euX43xDu
Llama 3
kaggle.com
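For assistant-like chat, the instruct variants of Llama 3 expect a header-delimited prompt format. A small helper sketching that assembly (the special-token layout follows Meta's published Llama 3 template; verify the exact tokens against the model card before relying on them):

```python
def build_llama3_prompt(messages):
    """Assemble a Llama 3 instruct-style prompt from [(role, content), ...] turns.

    Follows the header-delimited template described in Meta's Llama 3
    documentation; the exact special tokens here are an assumption to check
    against the official model card.
    """
    parts = ["<|begin_of_text|>"]
    for role, content in messages:
        parts.append(
            f"<|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>"
        )
    # End with an open assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_llama3_prompt([
    ("system", "You are a helpful assistant."),
    ("user", "Name one use of Llama 3."),
])
```

In practice a tokenizer's built-in chat template (where available) is the safer route; building the string by hand is mainly useful for understanding what the template produces.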
-
📢 We’ve heard your requests, and we’re thrilled to announce the launch of Competition Certificates! Inspired by the success of our shareable certificates on Kaggle Learn, you can now proudly showcase your competition achievements publicly. 👏 🏅Medal winners can access their certificates via the competition detail page, leaderboard, and the “Your Work” section. 📧 A link to your certificate will be included in a new email sent to all competition medalists once the leaderboard is finalized. 🏆 Certificates are available to all active and former medalists, dating back to our earliest 2010 competitions. 👉 More details: https://goo.gle/3WfD6rC
-
📣 KaggleX Fellowship program 2024 is seeking new advisors 👏🎉 Share your knowledge & mentor early-career data scientists! 👉 https://lnkd.in/egcGtRuN As an advisor, you will guide fellows to build a chatbot by fine-tuning Google's Gemma open models using custom conversation-style datasets. Take the next step to join the KaggleX community by July 21, 2024: https://lnkd.in/eq_QjAvy
KaggleX Fellowship Program
kaggle.com
-
🤖 Now on #KaggleModels! 🚀 Start exploring the possibilities with Gemma 2! Learn more: https://lnkd.in/e7ezdSYn
Gemma 2 is officially here! 🥳 Learn how you can access it → https://goo.gle/3RLQXUa Available in both 9B and 27B parameter sizes, Gemma 2 is higher performing than ever before. Download the weights on Kaggle and Hugging Face, or access the models in Google AI Studio. We’ve also released a new Gemma cookbook to help developers build their own applications and fine-tune Gemma 2 models for specific tasks responsibly.