Merge branch 'main' of hf.co:spaces/fdaudens/podcast-jobs
- .gitattributes +1 -0
- podcasts/podcast-2025-05-21.wav +3 -0
- rss.xml +11 -1
.gitattributes
CHANGED
@@ -50,3 +50,4 @@ podcasts/podcast-2025-05-15.wav filter=lfs diff=lfs merge=lfs -text
 podcasts/podcast-2025-05-16.wav filter=lfs diff=lfs merge=lfs -text
 podcasts/podcast-2025-05-19.wav filter=lfs diff=lfs merge=lfs -text
 podcasts/podcast-2025-05-20.wav filter=lfs diff=lfs merge=lfs -text
+podcasts/podcast-2025-05-21.wav filter=lfs diff=lfs merge=lfs -text
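The added .gitattributes line is the standard Git LFS tracking rule for the new episode. A minimal Python sketch of how the Space's publishing job could register a freshly generated wav with LFS before committing; the helper calls below are an assumption (not code from this Space), and they presume git and git-lfs are available in the job environment:

import subprocess

# Path of the newly generated episode (taken from this commit).
episode_path = "podcasts/podcast-2025-05-21.wav"

# "git lfs track <path>" appends "<path> filter=lfs diff=lfs merge=lfs -text"
# to .gitattributes, which is exactly the line added in this diff.
subprocess.run(["git", "lfs", "track", episode_path], check=True)

# Stage the updated .gitattributes together with the new audio file.
subprocess.run(["git", "add", ".gitattributes", episode_path], check=True)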
podcasts/podcast-2025-05-21.wav
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9b572b8ee9aab0e38f3c934f07fe695ff1b830e470439b3c38990313b17f2bed
+size 8205644
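These three added lines are a Git LFS pointer: the audio itself lives on the LFS server, while the repository keeps only its SHA-256 digest and byte size. A quick sketch for checking that a local copy of the wav matches the pointer above, assuming the file is present locally:

import hashlib
import os

wav_path = "podcasts/podcast-2025-05-21.wav"

# The pointer's "oid" is the SHA-256 of the file's contents; "size" is its length in bytes.
with open(wav_path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print(f"oid sha256:{digest}")               # should match the oid line above
print(f"size {os.path.getsize(wav_path)}")  # should print "size 8205644"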
rss.xml
CHANGED
@@ -14,7 +14,17 @@
 <itunes:email>florent.daudens@hf.co</itunes:email>
 </itunes:owner>
 <itunes:image href="https://huggingface.co/spaces/fdaudens/podcast-jobs/resolve/main/images/cover3.png" />
-<lastBuildDate>
+<lastBuildDate>Wed, 21 May 2025 14:28:49 +0000</lastBuildDate>
+<item>
+<title>BAGEL Makes Waves in Multimodal Pretraining</title>
+<description>BAGEL, a groundbreaking new multimodal foundation model, is revolutionizing the way we approach unified multimodal understanding and generation. Trained on trillions of tokens curated from diverse multimodal data, BAGEL exhibits emerging properties in complex multimodal reasoning, effortlessly navigating tasks like free-form image manipulation and future frame prediction while outperforming top-tier open-source VLMs on standard benchmarks.
+
+[Read the paper on Hugging Face](https://huggingface.co/papers/2505.14683)</description>
+<pubDate>Wed, 21 May 2025 14:28:49 +0000</pubDate>
+<enclosure url="https://huggingface.co/spaces/fdaudens/podcast-jobs/resolve/main/podcasts/podcast-2025-05-21.wav" length="8205644" type="audio/wav" />
+<guid>https://huggingface.co/spaces/fdaudens/podcast-jobs/resolve/main/podcasts/podcast-2025-05-21.wav</guid>
+<itunes:explicit>false</itunes:explicit>
+</item>
 <item>
 <title>Solving the Scaling Paradox with Chain-of-Language-Model</title>
 <description>We'll explore the groundbreaking concept of Chain-of-Language-Model (CoLM) which offers a novel approach to scale up language models while preserving training efficiency and enabling elastic inference by integrating multi-scale training within a single forward propagation.</description>
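For context on the rss.xml change: the new entry follows the RSS 2.0 / iTunes podcast item layout already used by the feed (title, description, pubDate, enclosure, guid, itunes:explicit). A minimal sketch of how such an item could be assembled in Python; build_item() and its arguments are hypothetical rather than the Space's actual generator, and the itunes: prefix is assumed to be declared on the feed's root element:

import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from email.utils import format_datetime

def build_item(title: str, description: str, audio_url: str, length_bytes: int) -> ET.Element:
    # Assemble one RSS <item>; element names mirror the entry added to rss.xml.
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "description").text = description
    # RSS uses RFC 2822 dates, e.g. "Wed, 21 May 2025 14:28:49 +0000".
    ET.SubElement(item, "pubDate").text = format_datetime(datetime.now(timezone.utc))
    ET.SubElement(item, "enclosure", url=audio_url, length=str(length_bytes), type="audio/wav")
    ET.SubElement(item, "guid").text = audio_url
    # "itunes:" is written as a plain prefix here; the real feed declares the namespace on <rss>.
    ET.SubElement(item, "itunes:explicit").text = "false"
    return item

episode_url = "https://huggingface.co/spaces/fdaudens/podcast-jobs/resolve/main/podcasts/podcast-2025-05-21.wav"
item = build_item(
    "BAGEL Makes Waves in Multimodal Pretraining",
    "BAGEL, a groundbreaking new multimodal foundation model...",  # shortened for the sketch
    episode_url,
    8205644,
)
print(ET.tostring(item, encoding="unicode"))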