Efficient Vision Encoding for Vision Language Models
Apple
Welcome to the official Hugging Face organization for Apple!
Apple Core ML: Build intelligence into your apps
Core ML is optimized for on-device performance of a broad variety of model types by leveraging Apple Silicon and minimizing memory footprint and power consumption.
- Models
- Depth Anything V2 Core ML: State-of-the-art depth estimation
- DETR ResNet-50 Core ML: Semantic segmentation
- FastViT Core ML: Image classification
- Stable Diffusion Core ML
- Additional Core ML Model Gallery Models
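
As a concrete illustration of on-device inference with the Core ML models above, here is a minimal Python sketch using coremltools. The local file name, the 256x256 input size, and the "image" input name are assumptions; check them against the spec of the model you actually download.

```python
# Minimal sketch: run a Core ML image classifier from Python with coremltools (macOS).
# Assumptions to verify against the downloaded model: the .mlpackage file name,
# the expected input resolution, and the input feature name "image".
import coremltools as ct
from PIL import Image

model = ct.models.MLModel("FastViT.mlpackage")         # hypothetical local path
image = Image.open("example.jpg").resize((256, 256))   # assumed input size
outputs = model.predict({"image": image})              # input name is an assumption
print(outputs)
```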
Apple Machine Learning Research
Open research to enable the community to deliver amazing experiences that improve the lives of millions of people every day.
Models
- DepthPro: State-of-the-art monocular depth estimation.
- OpenELM Base | Instruct: Open, transformer-based language models in base and instruction-tuned variants.
- MobileCLIP: Mobile-friendly image-text models (a zero-shot usage sketch follows this list).
- DCLM: State-of-the-art open-data language models via dataset curation.
- DFN: State-of-the-art open-data CLIP models via dataset curation.
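
A typical use of the image-text models above is zero-shot classification. The sketch below uses the open_clip library; the model name "MobileCLIP-S1" and the pretrained tag "datacompdr" are assumptions, so consult the MobileCLIP model cards for the exact identifiers.

```python
# Minimal zero-shot classification sketch with a CLIP-style image-text model.
# Assumptions: MobileCLIP weights are exposed through open_clip under the name
# "MobileCLIP-S1" with pretrained tag "datacompdr"; verify against the model card.
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "MobileCLIP-S1", pretrained="datacompdr"
)
tokenizer = open_clip.get_tokenizer("MobileCLIP-S1")

image = preprocess(Image.open("example.jpg")).unsqueeze(0)
text = tokenizer(["a photo of a cat", "a photo of a dog"])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(probs)  # similarity of the image to each text prompt
```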
Datasets
- FLAIR: A large image dataset for federated learning.
- DataCompDR: Improved datasets for training image-text models.
Benchmarks
- TiC-CLIP: A benchmark for efficient continual training of image-text models across multiple years of data.
Select Highlights and Other Resources
- Hugging Face Core ML Examples: Run Core ML models with two lines of code!
- Apple Model Gallery
- New features in Core ML Tools
- Apple Core ML Stable Diffusion: Library to run Stable Diffusion on Apple Silicon with Core ML.
- Hugging Face Blog Posts
Models (142)
- apple/FastVLM-0.5B • Text Generation • 0.8B params • 3.11k downloads • 109 likes (loading sketch after this list)
- apple/FastVLM-1.5B • Text Generation • 2B params • 786 downloads • 21 likes
- apple/FastVLM-7B • Text Generation • 8B params • 2.26k downloads • 100 likes
- apple/mobileclip2_coca_dfn2b_s13b_recap-coco-30k_s12m_context77 • 5 downloads
- apple/mobileclip2_coca_dfn2b_s13b_recap-coco-30k_s12m_context256 • 1 download
- apple/mobileclip2_coca_dfn2b_s13b_recap-coco-30k_s12m_context128 • 4 downloads
- apple/mobileclip2_coca_dfn2b_s13b_mscoco38k_s12m_context77 • 2 downloads
- apple/mobileclip2_coca_dfn2b_s13b_gbc1m-short_context77 • 4 downloads
- apple/mobileclip2_coca_dfn2b_s13b_gbc1m-long_context256 • 6 downloads
- apple/mobileclip2_coca_dfn2b_s13b_gbc10m-short-relation_context256 • 4 downloads
Datasets (9)
- apple/DataCompDR-12M-bf16 • 297 downloads
- apple/DataCompDR-12M • Viewer • 12.8M rows • 1.5k downloads • 31 likes (streaming sketch after this list)
- apple/DataCompDR-1B • Viewer • 1.28B rows • 236k downloads • 25 likes
- apple/DataComp-12M • Viewer • 12.8M rows • 108 downloads • 3 likes
- apple/GSM-Symbolic • Viewer • 12.5k rows • 6.03k downloads • 19 likes
- apple/mmau • Preview • 139 downloads • 4 likes
- apple/TiC-DataComp • Preview • 2.98k downloads • 3 likes
- apple/flair • Viewer • 429k rows • 260 downloads • 15 likes
- apple/mkqa • 557 downloads • 40 likes
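
The DataCompDR datasets above are large, so streaming is the practical way to take a first look. A minimal sketch with the datasets library follows; the "train" split name is an assumption, and the record schema varies by dataset, so check the dataset card before relying on specific fields.

```python
# Minimal sketch: stream a few records from a large dataset on the Hub.
# Assumptions: the default configuration and a "train" split exist; the
# record schema varies by dataset, so inspect the keys before relying on them.
from datasets import load_dataset

ds = load_dataset("apple/DataCompDR-12M", split="train", streaming=True)
for i, example in enumerate(ds):
    print(sorted(example.keys()))   # show the available fields
    if i >= 2:                      # look at just a few streamed records
        break
```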