DavidAU's Collections

Coders and Programmers: 1x to 10x MOE - Mixture of Experts

You can dial the number of active experts up or down. Versions are available with 32k to 128k context. Some models also include reasoning/thinking. Links to quants are provided at the repos.
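As a rough illustration (not taken from the collection itself), this is one way to adjust the active-expert count for a Transformers-loadable MoE checkpoint; the repo id below is a placeholder, and the config field name (num_experts_per_tok) is the one used by common MoE architectures and may differ per model.

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

repo_id = "DavidAU/example-moe-coder-model"  # placeholder repo id, not a real model

# Load the model's config and override how many experts are active per token.
config = AutoConfig.from_pretrained(repo_id)
config.num_experts_per_tok = 4  # dial experts up or down (assumed field name)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, config=config)

# Simple generation check with the adjusted expert count.
prompt = "Write a Python function that reverses a linked list."
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For GGUF quants run under llama.cpp or similar tools, the active-expert count is instead set through that tool's own options; check the individual repo cards for the exact settings each model supports.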