Discussion

#Algorithms

Evan Prodromou boosted
Mallory Knodel
@mallory@techpolicy.social · 3 days ago

Apply Now for the @internetsociety 2026 Pulse Research Fellowship and Mentorship https://pulse.internetsociety.org/blog/apply-now-for-the-2026-pulse-research-fellowship-and-mentorship

#Reddit #tiktok #x #disinformation #FacebookFediverse #bereal #Twitter #socialmediause #socialmedia #twitter #mastodon #threads #facebook #pinterest #instagram #Facebook #meta #algorithms #newsmast

Hacker News
@h4ckernews@mastodon.social · 6 days ago

Algorithms for Optimization [pdf]

https://algorithmsbook.com/optimization/files/optimization.pdf

#HackerNews #Algorithms #Optimization #PDF #MachineLearning #DataScience

Newsmast Foundation boosted
Mallory Knodel
@mallory@techpolicy.social · 2 weeks ago

Trying out the @newsmast client! Adds another layer to picking a server and making an account, as an additional tailoring of the ways I consume, interact with and create content.

#Reddit #tiktok #x #disinformation #FacebookFediverse #bereal #Twitter #socialmediause #socialmedia #twitter #mastodon #threads #facebook #pinterest #instagram #Facebook #meta #algorithms #newsmast

Hacker News
@h4ckernews@mastodon.social · last week

Functional Data Structures and Algorithms: a Proof Assistant Approach

https://fdsa-book.net/

#HackerNews #FunctionalDataStructures #Algorithms #ProofAssistant #Programming #HN

Woozle Hypertwin boosted
Pax Ahimsa Gethen
@funcrunch@me.dm · 2 weeks ago

So apparently there's some trend on #LinkedIn where women have reported getting tons more visibility after changing their listed #gender to male. A Black woman reported trying this and not getting the same results, because racism.

I commented that as a Black trans person I would never change my listed gender for algorithmic gain, and that I prefer posting here on Mastodon where I don't have to worry about said #algorithms.

https://www.linkedin.com/feed/update/urn:li:activity:7397054335463047169?commentUrn=urn%3Ali%3Acomment%3A%28activity%3A7397054335463047169%2C7397339837051248640%29&dashCommentUrn=urn%3Ali%3Afsd_comment%3A%287397339837051248640%2Curn%3Ali%3Aactivity%3A7397054335463047169%29

#BlackMastodon

Linked post by Cass Cooper, MHR (234 comments on LinkedIn):

"What happens when a Black woman switches her gender on LinkedIn to “male”? …apparently not the same thing that happens to white women. ✨

Over the past week, I’ve watched post after post from white women saying their visibility skyrocketed the moment they changed their profile gender from woman → man. More impressions. More likes. More reach. 📈

So I tried the same thing. And my visibility dropped. 👀

Here’s why that result matters: these experiments are being treated as if they’re only about gender; in reality they reveal something deeper about race + gender + algorithmic legitimacy. 🔍

A white woman toggling her gender is basically conducting a test inside a system where her racial credibility stays constant. She changes one variable. The algorithm keeps the rest of her privilege intact. 💡

When a Black woman does the same test? I’m not stepping into “white male privilege”; I’m stepping into a category that platforms and society have historically coded as less trustworthy, less safe, or less “professional.” Black + male is not treated the same as white + male. Not culturally. Not algorithmically. 🧩

So while white women are proving that gender bias exists (which is true), they’re doing it without naming the racial insulation that makes their results possible. Meanwhile, Black women and women of color are reminded—again—that we can’t separate gender from race because the world doesn’t separate them for us. 🗣️

This isn’t about placing blame; it’s about widening the conversation so the conclusions match the complexity. 🌍

If we’re going to talk about bias, visibility, and influence online, we cannot pretend we all start from the same default settings. 🔥

I’m curious: Have you run your own experiment with identity signals on this platform? What changed… and what didn’t? 👇🏾"

Hacker News
@h4ckernews@mastodon.social · 3 weeks ago

How to identify a prime number without a computer

https://www.scientificamerican.com/article/how-to-identify-a-prime-number-without-a-computer/

#HackerNews #prime #number #math #computer #science #algorithms #education
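
The linked article covers pencil-and-paper methods; as a rough programmatic analogue (my own sketch, not taken from the article), trial division up to √n is all you need to check whether a small number is prime:

```python
from math import isqrt

def is_prime(n: int) -> bool:
    """Trial division: test divisors up to the integer square root of n."""
    if n < 2:
        return False
    if n < 4:
        return True                       # 2 and 3 are prime
    if n % 2 == 0:
        return False
    for d in range(3, isqrt(n) + 1, 2):   # odd candidates only
        if n % d == 0:
            return False
    return True

print([p for p in range(2, 60) if is_prime(p)])  # 2, 3, 5, 7, 11, ...
```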

UniversityofGroningenLibrary
@Bibliothecaris@social.edu.nl · 3 weeks ago

New @universityofgroningen #OpenAccess #publication in the spotlight:

➡️ Exploring concept creep: Youth’s portrayal of #ADHD on #TikTok

🔗 https://doi.org/10.1016/j.ssmmh.2025.100489

Read our interview with corresponding author Wietske de Vries:

🔗 https://www.rug.nl/library/open-access/blog/open-access-publication-in-the-spotlight-exploring-concept-creep-youth-s-portrayal-of-adhd-on-ti

#OpenScience #SocialSciences #Behavior #Algorithms #SocialMedia #Misinformation #MentalHealth #Research #PhD

Exploring concept creep: Youth’s portrayal of ADHD on TikTok

Hacker News
@h4ckernews@mastodon.social · 3 weeks ago

De Bruijn Numerals

https://text.marvinborner.de/2023-08-22-22.html

#HackerNews #De #Bruijn #Numerals #HackerNews #Programming #Mathematics #Algorithms

de Bruijn Numerals

Em :official_verified: boosted
Miguel Afonso Caetano
@remixtures@tldr.nettime.org · last month

"Everyone sharing his or her data to train A.I. is great if we agree with the goals that were given to the A.I. It’s not so great if we don’t agree with these goals; and if the algorithm’s decisions might cost us our jobs, happiness, liberty or even lives.

To safeguard ourselves from collective harm, we need to build institutions and pass laws that give people affected by A.I. algorithms a voice over how those algorithms are designed, and what they aim to achieve. The first step is transparency. Similar to corporate financial reporting requirements, companies and agencies that use A.I. should be required to disclose their objectives and what their algorithms are trying to maximize — whether that’s ad clicks on social media, hiring workers who won’t join unions or total deportation counts.

The second step is participation. The people whose data are used to train the algorithms — and whose lives are shaped by them — should help decide their goals. Like a jury of peers who hear a civil or criminal case and render a verdict together, we might create citizens’ assemblies where a representative randomly chosen set of people deliberates and decides on appropriate goals for algorithms. That could mean workers at a firm deliberating about the use of A.I. at their workplace, or a civic assembly that reviews the objectives of predictive policing tools before government agencies deploy them. These are the kinds of democratic checks that could align A.I. with the public good, not just private power.

The future of A.I. will not be decided by smarter algorithms or faster chips. It will depend on who controls the data — and whose values and interests guide the machines. If we want A.I. that serves the public, the public must decide what it serves."

https://www.nytimes.com/2025/11/02/opinion/ai-privacy.html?unlocked_article_code=1.yU8.8BEa.DltbW_WwVhxN&smid=nytcore-android-share

#AI #Algorithms #Privacy #DifferentialPrivacy #AITraining

Opinion | How A.I. Can Use Your Personal Data to Hurt Your Neighbor

alcinnz boosted
Bits
@bits@mastodon.online · last month

At the end you use `git bisect`
(this is especially useful in messy codebases with no test coverage)

https://kevin3010.github.io/git/2025/11/02/At-the-end-you-use-git-bisect.html

#programming #git #blog #algorithms

Kevin Jivani

At the end you use git bisect

People rant about having to learn algorithmic questions for interviews. I get it — interview system is broken, but you ought to learn binary search at least.
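
The linked post's point is that `git bisect` is binary search applied to commit history: mark one commit good and one bad, and git repeatedly checks out a midpoint until the first bad commit is found (`git bisect start`, then `git bisect good <commit>` / `git bisect bad`, or `git bisect run <test-command>` to automate the testing step). A minimal sketch of that idea, with a hypothetical `commits` list and `is_bad` test rather than real git plumbing:

```python
def first_bad_commit(commits, is_bad):
    """Binary search over a linear history: commits[0] is known good,
    commits[-1] is known bad, and badness is monotone (once broken,
    always broken) -- the same assumption git bisect relies on."""
    lo, hi = 0, len(commits) - 1       # lo = last known good, hi = first known bad
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if is_bad(commits[mid]):       # with git, this step is "check out and test"
            hi = mid
        else:
            lo = mid
    return commits[hi]                 # the first bad commit

# Hypothetical history of 12 commits where the regression lands at index 7.
history = list(range(12))
print(first_bad_commit(history, lambda c: c >= 7))  # -> 7
```

Like `git bisect`, this needs only O(log n) tests, which is why it stays practical even in large, messy histories.
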
Cory Doctorow boosted
six_grandfathers_mountain
@six_grandfathers_mountain@mastodon.social · last month

@mweston @mike805 @kathhayhoe @pluralistic
RE
Ban #advertising, collection of data about users, #algorithms....

Well, that would be nice, but we just gotta find the good in the #enshitification and the #PlatformDecay

Oct 31 2025
#CoryDoctorow joins #StephanieRuhle to discuss his new book, "Enshittification, Why Everything Suddenly Got Worse and What to Do About It"

⭕see next post for the 7min VIDEO

Video: Cory Doctorow on MSNBC, Oct 31 2025, talking about platform decay. He calls this enshittification.

Hacker News
@h4ckernews@mastodon.social · last month

Rotating Workforce Scheduling in MiniZinc

https://zayenz.se/blog/post/rotating-workforce-scheduling/

#HackerNews #RotatingWorkforce #Scheduling #MiniZinc #Optimization #Algorithms #WorkforceManagement

Aral Balkan
@aral@mastodon.ar.al · 2 months ago

Algorithmed (v): to be conditioned into thinking a certain way by algorithms that filter your reality in line with the goals of those who author them.

e.g., “They were algorithmed into thinking that way about trans people.”

#algorithmed #algorithms #truth #reality #tech #filters #BigTech #SiliconValley #technoFascism
