fdaudens (HF Staff) committed
Commit 48d3f91 · verified · Parent: 56a84cc

Upload rss.xml with huggingface_hub

Files changed (1):
1. rss.xml (+11 -1)
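The commit message says the feed was pushed with the huggingface_hub client. A minimal sketch of such an upload, assuming default token handling from `huggingface-cli login`; the repo id comes from the URLs in the feed, but the exact call is not recorded in this commit:

```python
from huggingface_hub import HfApi

api = HfApi()  # picks up the token saved by `huggingface-cli login`
api.upload_file(
    path_or_fileobj="rss.xml",        # local feed file to upload
    path_in_repo="rss.xml",           # destination path inside the repo
    repo_id="fdaudens/podcast-jobs",  # Space id, as seen in the feed URLs
    repo_type="space",
    commit_message="Upload rss.xml with huggingface_hub",
)
```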
rss.xml CHANGED
@@ -14,7 +14,17 @@
  <itunes:email>florent.daudens@hf.co</itunes:email>
  </itunes:owner>
  <itunes:image href="https://huggingface.co/spaces/fdaudens/podcast-jobs/resolve/main/images/cover3.png" />
- <lastBuildDate>Tue, 20 May 2025 18:46:38 +0000</lastBuildDate>
+ <lastBuildDate>Wed, 21 May 2025 14:28:49 +0000</lastBuildDate>
+ <item>
+ <title>BAGEL Makes Waves in Multimodal Pretraining</title>
+ <description>BAGEL, a groundbreaking new multimodal foundation model, is revolutionizing the way we approach unified multimodal understanding and generation. Trained on trillions of tokens curated from diverse multimodal data, BAGEL exhibits emerging properties in complex multimodal reasoning, effortlessly navigating tasks like free-form image manipulation and future frame prediction while outperforming top-tier open-source VLMs on standard benchmarks.
+
+ [Read the paper on Hugging Face](https://huggingface.co/papers/2505.14683)</description>
+ <pubDate>Wed, 21 May 2025 14:28:49 +0000</pubDate>
+ <enclosure url="https://huggingface.co/spaces/fdaudens/podcast-jobs/resolve/main/podcasts/podcast-2025-05-21.wav" length="8205644" type="audio/wav" />
+ <guid>https://huggingface.co/spaces/fdaudens/podcast-jobs/resolve/main/podcasts/podcast-2025-05-21.wav</guid>
+ <itunes:explicit>false</itunes:explicit>
+ </item>
  <item>
  <title>Solving the Scaling Paradox with Chain-of-Language-Model</title>
  <description>We'll explore the groundbreaking concept of Chain-of-Language-Model (CoLM) which offers a novel approach to scale up language models while preserving training efficiency and enabling elastic inference by integrating multi-scale training within a single forward propagation.</description>
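The diff appends one new `<item>` and bumps `<lastBuildDate>`. A minimal sketch of how the new entry's date and enclosure fields could be produced; the URL, path, and size values come from the diff, but the generation code itself is an assumption and requires the wav file to exist locally:

```python
import os
from datetime import datetime, timezone
from email.utils import format_datetime

# RSS dates (lastBuildDate, pubDate) use RFC 2822 format, as in the diff.
now = format_datetime(datetime.now(timezone.utc))

audio_path = "podcasts/podcast-2025-05-21.wav"
audio_url = ("https://huggingface.co/spaces/fdaudens/podcast-jobs"
             "/resolve/main/" + audio_path)

# The enclosure `length` attribute is the file size in bytes (8205644 here).
item = f"""<item>
<title>BAGEL Makes Waves in Multimodal Pretraining</title>
<pubDate>{now}</pubDate>
<enclosure url="{audio_url}" length="{os.path.getsize(audio_path)}" type="audio/wav" />
<guid>{audio_url}</guid>
<itunes:explicit>false</itunes:explicit>
</item>"""
```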