Neuromatch is at #SfN2025 🎉
Come by booth 3704 to learn about:
🌟 Our virtual, paid TA opportunities
🌟 Online courses in #ComputationalNeuroscience, #DeepLearning, and #NeuroAI
🌟 Mentoring opportunities
🌟 Our Impact Scholar Program
What an incredible first day at #SfN2025!
Find us in the Nonprofit Area, Booth #3704
If you haven’t yet, come by to:
⭐ Learn about our Computational Neuroscience courses
⭐ Learn about TA and mentoring opportunities
⭐ Chat with the CEO and Academy Program Director
See you at the booth!
#Neuromatch #ComputationalNeuroscience #DeepLearning #NeuroAI #GlobalResearch #EarlyCareerResearchers #STEMEducation #DataScience
Safe travels to everyone in our community heading to San Diego for #SfN2025!
Be sure to come see us at Booth #3704 in the Nonprofit Area!
#Neuromatch #ComputationalNeuroscience #DeepLearning #NeuroAI #ClimateScience #GlobalResearch #EarlyCareerResearchers #STEMEducation #DataScience #SfN
When Luai first joined Neuromatch Academy’s #DeepLearning Course, he had no experience in deep learning, just curiosity and a willingness to learn. What he discovered changed his path.
➡️ https://www.linkedin.com/feed/update/urn:li:activity:7394405408603271168
#MyNeuromatchStory #STEMEducation #OpenScience #ComputationalNeuroscience
I’m happy to share that I’ve obtained a new certification: AI Engineer for Data Scientists Associate from DataCamp!
#AI #DataScience #MachineLearning #DeepLearning #DataCamp #ArtificialIntelligence #ContinuousLearning
🚨 Ex-OpenAI CTO Mira Murati just launched Tinker, a new service from Thinking Machines Lab.
Tinker strips AI training down to 4 simple functions — you focus on data + algorithms, it handles the GPU chaos.
Is this the Kubernetes moment for AI training?
https://dropletdrift.com/ex-openai-cto-mira-murati-launches-tinker-to-simplify-ai-model-training/
#AI #ArtificialIntelligence #MachineLearning #DeepLearning #AIresearch #LLM #OpenSource #Tech #Innovation #DataScience #NeuralNetworks #FutureOfAI #AIcommunity #AIethics #Startups #OpenAI #Developers #Research #Computing
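To make the "4 simple functions" idea concrete, here is a toy, self-contained sketch of what such a training API looks like from the user's side. The method names follow the announcement's primitives; the scalar least-squares "model" and everything else in the mock are hypothetical stand-ins, not Tinker's actual implementation (which runs the heavy lifting on managed GPUs).

```python
# Toy mock of a four-function training API in the spirit of Tinker.
# The user supplies data and the training loop; the service (mocked
# here by a plain class) owns forward/backward, the optimizer step,
# sampling, and checkpointing.

class ToyTrainer:
    """Fits y = w * x by least squares on a single scalar weight."""

    def __init__(self, lr=0.1):
        self.w, self.lr, self.grad = 0.0, lr, 0.0

    def forward_backward(self, x, y):
        # Compute the loss and stash the gradient for the next optim_step.
        pred = self.w * x
        self.grad = 2 * (pred - y) * x  # d(loss)/dw
        return (pred - y) ** 2

    def optim_step(self):
        # One SGD update using the stored gradient.
        self.w -= self.lr * self.grad

    def sample(self, x):
        # "Inference": run the current model forward.
        return self.w * x

    def save_state(self):
        # Checkpoint: return everything needed to resume training.
        return {"w": self.w}

trainer = ToyTrainer(lr=0.05)
for _ in range(200):
    trainer.forward_backward(2.0, 6.0)  # target relation: w = 3
    trainer.optim_step()
```

The point of the design is the split: your loop calls these four primitives in any order you like, while device placement and distribution stay behind the API boundary.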
Hey everyone 👋
I’m diving deeper into running AI models locally—because, let’s be real, the cloud is just someone else’s computer, and I’d rather have full control over my setup. Renting server space is cheap and easy, but it doesn’t give me the hands-on freedom I’m craving.
So, I’m thinking about building my own AI server/workstation! I’ve been eyeing some used ThinkStations (like the P620) or even a server rack, depending on cost and value. But I’d love your advice!
My Goal:
Run larger LLMs locally on a budget-friendly but powerful setup. Since I don’t need gaming features (ray tracing, DLSS, etc.), I’m leaning toward used server GPUs that offer great performance for AI workloads.
Questions for the Community:
1. Does anyone have experience with these GPUs? Which one would you recommend for running larger LLMs locally?
2. Are there other budget-friendly server GPUs I might have missed that are great for AI workloads?
3. Any tips for building a cost-effective AI workstation? (Cooling, power supply, compatibility, etc.)
4. What’s your go-to setup for local AI inference? I’d love to hear about your experiences!
I’m all about balancing cost and performance, so any insights or recommendations are hugely appreciated.
Thanks in advance! 🙌
@selfhosted@a.gup.pe #AIServer #LocalAI #BudgetBuild #LLM #GPUAdvice #Homelab #AIHardware #DIYAI #ServerGPU #ThinkStation #UsedTech #AICommunity #OpenSourceAI #SelfHostedAI #TechAdvice #AIWorkstation #MachineLearning #AIResearch #FediverseAI #LinuxAI #AIBuild #DeepLearning #ServerBuild #BudgetAI #AIEdgeComputing #Questions #CommunityQuestions #HomeServer #AILab #LLMLab
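For anyone sizing GPUs for a build like this, a rough back-of-the-envelope VRAM check is a good first filter. The sketch below uses the common rule of thumb of weights (params × bits / 8) plus headroom for KV cache and activations; the 20% overhead factor is an assumption and varies with context length and runtime.

```python
# Rough VRAM estimate for running an LLM locally (rule-of-thumb sketch;
# the overhead factor for KV cache/activations is an assumption).

def vram_estimate_gb(params_billion, bits_per_weight, overhead=1.2):
    # Weight memory in GB = params (billions) * bits / 8, plus headroom.
    return params_billion * bits_per_weight / 8 * overhead

# e.g. a 70B model at 4-bit quantization:
print(f"{vram_estimate_gb(70, 4):.0f} GB")  # more than a single 24 GB card
```

A 7B model at 4-bit comes out around 4 GB by the same estimate, which is why quantization level matters as much as the GPU you pick.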
Gary Marcus is onto something here. Maybe true AGI is not so impossible to reach after all: probably not in the near future, but likely within 20 years.
"For all the efforts that OpenAI and other leaders of deep learning, such as Geoffrey Hinton and Yann LeCun, have put into running neurosymbolic AI, and me personally, down over the last decade, the cutting edge is finally, if quietly and without public acknowledgement, tilting towards neurosymbolic AI.
This essay explains what neurosymbolic AI is, why you should believe it, how deep learning advocates long fought against it, and how in 2025, OpenAI and xAI have accidentally vindicated it.
And it is about why, in 2025, neurosymbolic AI has emerged as the team to beat.
It is also an essay about sociology.
The essential premise of neurosymbolic AI is this: the two most common approaches to AI, neural networks and classical symbolic AI, have complementary strengths and weaknesses. Neural networks are good at learning but weak at generalization; symbolic systems are good at generalization, but not at learning."
https://garymarcus.substack.com/p/how-o3-and-grok-4-accidentally-vindicated
#AI #NeuralNetworks #DeepLearning #SymbolicAI #NeuroSymbolicAI #AGI
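The complementary-strengths premise quoted above can be shown in miniature. In this illustrative sketch the "neural" side is deliberately reduced to its essence as a memorizing function fit to training data (a stand-in, not a real network), while the symbolic side is an explicit rule that generalizes exactly but learns nothing.

```python
# Toy contrast between the two approaches: a learned lookup that fits
# its training data perfectly vs. a symbolic rule that extrapolates.

# "Training data": all single-digit additions.
train = {(a, b): a + b for a in range(10) for b in range(10)}

def learned_add(a, b):
    # Memorizing stand-in for a learned model: perfect in-distribution,
    # no rule to extrapolate with (returns None outside training range).
    return train.get((a, b))

def symbolic_add(a, b):
    # The symbolic rule generalizes to any integers, but nothing was learned.
    return a + b

print(learned_add(3, 4), symbolic_add(3, 4))    # in-distribution: both work
print(learned_add(40, 2), symbolic_add(40, 2))  # out of distribution: only the rule works
```

A neurosymbolic system, in this framing, would learn the mapping from raw input to symbols and then hand off to rules like `symbolic_add` for exact generalization.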