NuminaMath, a model developed by Numina and Hugging Face, won the 1st Progress Prize of the AI Math Olympiad (AIMO) by fine-tuning open LLMs to solve 29 out of 50 math problems on the private test set. The competition involved solving difficult math problems of the kind high school students use to train for the International Math Olympiad. The winning model, NuminaMath 7B TIR, was developed through Numina, an open AI4Maths initiative.
AI | Rating: 61 | 2024-07-11 03:46:26 PM |
KerasNLP now integrates with the Hugging Face Hub, giving users access to a first batch of 33 pre-trained models. The integration works with KerasNLP-based models: users can train and fine-tune them, then upload them back to the Hub. The Hugging Face Hub currently hosts over 750,000 public models, 346,268 of which were built with the Transformers library.
AIHuggingFace | Rating: 89 | 2024-07-10 02:27:53 PM |
Hugging Face is experimenting with automatic PII detection on the Hub using Presidio, a feature aimed at addressing the issue of undocumented private information about individuals in machine learning datasets. The feature is being tested on two types of datasets containing PII: annotated PII datasets and datasets specifically designed to train PII detection models. The PII-Masking-300k dataset by Ai4Privacy is an example of the latter.
AI | Rating: 58 | 2024-07-10 01:56:15 PM |
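To illustrate the masking task these datasets target, here is a toy sketch that replaces detected PII spans with type placeholders. The patterns and names below are illustrative assumptions only, far simpler than Presidio's recognizers, which combine NER models, context words, and checksums.

```python
import re

# Toy patterns for two common PII types; real detectors (e.g. Presidio)
# use far more robust, model-assisted recognizers.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def mask_pii(text):
    """Replace each detected span with a [TYPE] placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

masked = mask_pii("Contact jane.doe@example.com or 555-123-4567.")
# masked == "Contact [EMAIL] or [PHONE]."
```

Masking datasets like PII-Masking-300k pair raw text with exactly this kind of placeholder-annotated output, so models can learn the detection step end to end.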
Preference optimization is an alternative to traditional supervised methods for training vision language models (VLMs) to reflect human preferences. Instead of assigning fixed labels, it compares and ranks candidate answers, allowing models to capture the subtleties of human judgment more effectively. Preference optimization is widely used for fine-tuning language models, and it can be applied to VLMs as well.
AIOpenAI | Rating: 82 | 2024-07-10 01:36:29 PM |
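As one concrete instance of this idea, a pairwise objective such as Direct Preference Optimization (DPO) trains the model to assign higher likelihood to the preferred answer than to the rejected one, relative to a frozen reference model. The sketch below is a minimal illustration; the function name and toy log-probabilities are assumptions, not details from the post.

```python
import math

def dpo_loss(logp_chosen, logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """DPO loss for a single preference pair.

    Inputs are summed log-probabilities of the chosen and rejected
    answers under the policy being trained and a frozen reference model.
    """
    # Implicit reward margin between chosen and rejected answers
    margin = (logp_chosen - ref_logp_chosen) - (logp_rejected - ref_logp_rejected)
    # Negative log-sigmoid: small when the policy prefers the chosen answer
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))

# The loss shrinks as the policy favors the chosen answer more strongly
# than the reference model does.
easy = dpo_loss(-5.0, -20.0, -10.0, -10.0)   # chosen strongly preferred
hard = dpo_loss(-20.0, -5.0, -10.0, -10.0)   # rejected preferred
```

With equal log-probabilities the margin is zero and the loss is log 2; it decreases monotonically as the margin grows, which is what pushes probability mass toward preferred answers during fine-tuning.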
Banque des Territoires, part of the Caisse des Dépôts et Consignations group, collaborated with Polyconseil and Hugging Face to develop a sovereign data solution in support of EduRénov, a national program dedicated to the ecological renovation of schools. The solution is intended to optimize the program's support framework.
AI | Rating: 53 | 2024-07-09 07:06:21 AM |
Google Cloud TPUs are now available to Hugging Face users, allowing AI builders to accelerate their applications. TPUs are custom-made AI hardware designed by Google, known for their ability to scale cost-effectively and deliver impressive performance across various AI workloads. This collaboration aims to provide users with the best tools and infrastructure for their AI projects.
AI | Rating: 71 | 2024-07-08 04:26:17 PM |
The Hugging Face Dataset Hub hosts over 180,000 public datasets, and to improve dataset discoverability and visualization, the organization is announcing four new features for Dataset Search. These features will help researchers and engineers find, explore, and transform datasets for various tasks, including training LLMs and evaluating automatic speech recognition or computer vision systems.
AIHuggingFace | Rating: 89 | 2024-07-08 12:26:24 PM |
Mila and Intel Labs released ProtST, a multi-modal language model for protein design, at the International Conference on Machine Learning 2023. The model has been well received, accumulating over 40 citations in less than a year. ProtST can predict the subcellular location of an amino acid sequence, a popular task for protein language models.
AI | Rating: 56 | 2024-07-04 02:36:16 PM |
Our Transformers Code Agent has beaten the GAIA benchmark, a challenging and comprehensive test for agents: systems built on large language models that can call external tools and iterate on further steps based on the model's output. GAIA is considered tough, which makes the Code Agent's performance all the more notable.
AITransformers | Rating: 81 | 2024-07-01 01:54:05 PM |
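The tool-calling loop such agents run can be sketched as follows. This is a hypothetical simplification with a stubbed "LLM" and a single tool, not the actual Transformers agents API: the real Code Agent has the model generate Python code that is executed and fed back.

```python
# A minimal ReAct-style agent loop: the model proposes a tool call,
# observes the result, and iterates until it produces a final answer.

def calculator(expression):
    """A tool the agent can call (restricted eval for the sketch)."""
    return str(eval(expression, {"__builtins__": {}}))

def fake_llm(history):
    """Stand-in for an LLM: first requests a tool, then answers."""
    if "Observation" not in history:
        return "Action: calculator(21 * 2)"
    return "Final Answer: 42"

def run_agent(task, max_steps=5):
    history = f"Task: {task}"
    for _ in range(max_steps):
        reply = fake_llm(history)
        if reply.startswith("Final Answer:"):
            return reply.split(":", 1)[1].strip()
        # Parse the requested tool call, execute it, feed the result back
        expr = reply.split("calculator(", 1)[1].rstrip(")")
        history += f"\nObservation: {calculator(expr)}"
    return None

result = run_agent("What is 21 multiplied by 2?")
# result == "42"
```

Expressing actions as code rather than JSON tool calls is the key design choice of code agents: one generation can compose several tool calls, loops, and intermediate variables.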
Google has released Gemma 2, a new open large language model (LLM) available in two sizes, 9 billion and 27 billion parameters, each with base and instruction-tuned versions. The model builds on Google DeepMind's Gemini research and has an 8K-token context length. The models are available on the Hub and can be integrated with Hugging Face Transformers and Google Cloud, with inference endpoints also available.
AI | Rating: 70 | 2024-06-27 03:55:41 PM |