GEN-0 / Embodied Foundation Models That Scale with Physical Interaction
https://generalistai.com/blog/nov-04-2025-GEN-0
#HackerNews #GEN0 #EmbodiedAI #FoundationModels #PhysicalInteraction #AIresearch
Why Fei-Fei Li and Yann LeCun Are Both Betting on "World Models"
https://entropytown.com/articles/2025-11-13-world-model-lecun-feifei-li/
#HackerNews #FeiFeiLi #YannLeCun #WorldModels #AIResearch #MachineLearning
Reverse engineering a neural network's clever solution to binary addition (2023)
https://cprimozic.net/blog/reverse-engineering-a-small-neural-network/
#HackerNews #ReverseEngineering #NeuralNetworks #BinaryAddition #AIResearch #2023 #Insights
From Memorization to Reasoning in the Spectrum of Loss Curvature
https://arxiv.org/abs/2510.24256
#HackerNews #Memorization #Reasoning #LossCurvature #MachineLearning #AIResearch
TabPFN-2.5 – SOTA foundation model for tabular data
https://priorlabs.ai/technical-reports/tabpfn-2-5-model-report
#HackerNews #TabPFN2.5 #SOTA #TabularData #FoundationModel #AIResearch
The Smol Training Playbook: The Secrets to Building World-Class LLMs
https://huggingface.co/spaces/HuggingFaceTB/smol-training-playbook
#HackerNews #SmolTrainingPlaybook #LLMs #AIResearch #MachineLearning #HuggingFace
Reasoning Models Reason Well, Until They Don't
https://arxiv.org/abs/2510.22371
#HackerNews #ReasoningModels #ReasonWell #AIResearch #MachineLearning
Language Models Are Injective and Hence Invertible
https://arxiv.org/abs/2510.15511
#HackerNews #LanguageModels #Invertibility #AIResearch #NaturalLanguageProcessing #MachineLearning
Cursor Composer: Building a fast frontier model with RL
https://cursor.com/blog/composer
#HackerNews #CursorComposer #FastFrontier #RLModel #AIResearch
The Continual Learning Problem
https://jessylin.com/2025/10/20/continual-learning/
#HackerNews #ContinualLearning #MachineLearning #AIResearch #EducationTech
Artificial Writing and Automated Detection [pdf]
https://www.nber.org/system/files/working_papers/w34223/w34223.pdf
#HackerNews #ArtificialWriting #AutomatedDetection #AIResearch #NBER #PDF
🚨 Ex-OpenAI CTO Mira Murati just launched Tinker, a new service from Thinking Machines Lab.
Tinker strips AI training down to 4 simple functions — you focus on data + algorithms, it handles the GPU chaos.
Is this the Kubernetes moment for AI training?
https://dropletdrift.com/ex-openai-cto-mira-murati-launches-tinker-to-simplify-ai-model-training/
#AI #ArtificialIntelligence #MachineLearning #DeepLearning #AIresearch #LLM #OpenSource #Tech #Innovation #DataScience #NeuralNetworks #FutureOfAI #AIcommunity #AIethics #Startups #OpenAI #Developers #Research #Computing
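For readers wondering what "4 simple functions" looks like in practice: public descriptions of Tinker name forward_backward, optim_step, sample, and save_state as the core primitives. The sketch below is hypothetical; the client object, argument names, and return values are assumptions for illustration, not Tinker's actual SDK.

```python
# Hypothetical sketch of a "four function" training loop in the spirit of the
# post's description of Tinker. The primitive names follow public coverage of
# the launch; everything else (client object, signatures, return values) is
# an assumption, not the real API.

def train(client, dataset, num_steps=100):
    """client is assumed to expose the four primitives; the managed service
    would run them on its own GPU fleet."""
    for step, batch in zip(range(num_steps), dataset):
        # 1. Compute loss and accumulate gradients for one batch.
        loss = client.forward_backward(batch)

        # 2. Apply an optimizer update using the accumulated gradients.
        client.optim_step()

        if step % 20 == 0:
            # 3. Sample from the current model, e.g. for evals or RL rollouts.
            completion = client.sample(prompt="2 + 2 =", max_tokens=8)
            print(f"step {step}: loss={loss:.3f}, sample={completion!r}")

    # 4. Persist weights and optimizer state so training can resume later.
    client.save_state("checkpoint-final")
```

The point of the abstraction, as the post frames it: the user supplies data and the training algorithm, while distribution across GPUs, failures, and checkpoints are handled behind these four calls.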
Hey everyone 👋
I’m diving deeper into running AI models locally—because, let’s be real, the cloud is just someone else’s computer, and I’d rather have full control over my setup. Renting server space is cheap and easy, but it doesn’t give me the hands-on freedom I’m craving.
So, I’m thinking about building my own AI server/workstation! I’ve been eyeing some used ThinkStations (like the P620) or even a server rack, depending on cost and value. But I’d love your advice!
My Goal:
Run larger LLMs locally on a budget-friendly but powerful setup. Since I don’t need gaming features (ray tracing, DLSS, etc.), I’m leaning toward used server GPUs that offer great performance for AI workloads.
Questions for the Community:
1. Does anyone have experience with used server GPUs for this kind of workload? Which would you recommend for running larger LLMs locally?
2. Are there other budget-friendly server GPUs I might have missed that are great for AI workloads?
3. Any tips for building a cost-effective AI workstation? (Cooling, power supply, compatibility, etc.)
4. What’s your go-to setup for local AI inference? I’d love to hear about your experiences! (One common setup is sketched below as a reference point.)
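As a concrete reference point for question 4: a minimal sketch of one common budget setup, llama-cpp-python serving a quantized GGUF model with layers offloaded to the GPU. The model path is a placeholder, and this is just one option among several (Ollama, vLLM, text-generation-webui).

```python
# Minimal local-inference sketch using llama-cpp-python
# (pip install llama-cpp-python, built with GPU support).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b-q4_k_m.gguf",  # placeholder: any local GGUF model
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to the GPU if VRAM allows
)

out = llm(
    "List three things to check when buying a used server GPU:",
    max_tokens=128,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```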
I’m all about balancing cost and performance, so any insights or recommendations are hugely appreciated.
Thanks in advance! 🙌
@selfhosted@a.gup.pe #AIServer #LocalAI #BudgetBuild #LLM #GPUAdvice #Homelab #AIHardware #DIYAI #ServerGPU #ThinkStation #UsedTech #AICommunity #OpenSourceAI #SelfHostedAI #TechAdvice #AIWorkstation #MachineLearning #AIResearch #FediverseAI #LinuxAI #AIBuild #DeepLearning #ServerBuild #BudgetAI #AIEdgeComputing #Questions #CommunityQuestions #HomeServer #Ailab #llmlab