How µTorrent Became Hated - https://www.youtube.com/watch?v=7hbkgy19Yrg
> There was a time when µTorrent ruled the world... what happened?
Progress from SciOp on distributing large scientific datasets via #bittorrent and preservation via Lots of Copies Keeps Stuff Safe. This feature allows uploads and torrent modifications via a web interface, so you can participate even if your institution blocks BitTorrent. #OpenData via @jonny
Does anyone know the social history of libtorrent? It now appears only sparsely maintained, still almost exclusively by arvid; really huge bugs go unanswered, and I'm finding tons of offers to help with stuff also getting just ignored. How did this whole protocol get built on top of this one library maintained by this one person? Asking for specifics, rather than "that just always happens".
Mastodon Isn’t Complying With Age Verification Because It Can’t
While other services are bending at the knee over age verification, Mastodon has no actual means to comply.
https://www.freezenet.ca/mastodon-isnt-complying-with-age-verification-because-it-cant/
#Censorship #FileSharing #News #Technology #ActivityPub #AgeVerification #BitTorrent #eDonkey2000 #FreeSpeech #government #Mastodon #SocialMedia
Ok, so I'm trying to make a torrent for #Tenacity and seed it. I figure I could create the torrent on one computer, transfer it to my file server via #BitTorrent (where it'll seed), and then distribute the torrent and magnet link.
The transfer works fine because my file server and computer are on the same network. However, when I test with my phone or another computer on a different network, it doesn't work. My file server is behind a firewall, but ports 6881-6889 are open on both my router and the server's firewall.
What am I doing wrong? Is this too complicated?
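One way to narrow this down is to check whether the torrent port is actually reachable from outside. A minimal stdlib-only Python sketch (this only tests TCP; BitTorrent also uses UDP, but a refused TCP connection from outside the LAN usually points at the port forward or ISP CGNAT rather than the torrent setup):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run from a machine OUTSIDE your network, against your public IP:
# port_open("203.0.113.7", 6881)   # placeholder address
```

If this returns True on the LAN but False from outside, the swarm can't reach the seed and the forwarding chain is the thing to debug.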
Have a disused laptop or Raspi? Make it part of the swarm and take the data outside the US (or any) administration's grasp!
#scienceunderattack #bittorrent #decentralizedbackup #libraryofcongress
check this out if you want to help preserve the archive of "most local newspapers through most of US history" that had its funding pulled, even if you only have a couple dozen gigabytes to spare, you can 
a) make an account on https://sciop.net/ , 
b) run a qbittorrent instance, go to preferences>web ui and click enable, 
and just do this
python -m pip install sciop-scraping
sciop-cli login
sciop-cli client add
sciop-scrape chronicling-america --next
and that's all.
if you have spare storage, you can sort by seeders, ascending, and start from there. or subscribe to the rss feed and auto-download it.
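The auto-download route can also go straight at the qBittorrent Web UI enabled above. A minimal sketch of adding a magnet via the WebUI API v2 (endpoint and field names per qBittorrent's API docs; real setups need a login cookie from /api/v2/auth/login unless localhost auth bypass is enabled, and the magnet below is a placeholder):

```python
from urllib.parse import urlencode
from urllib.request import Request

def add_magnet_request(base_url: str, magnet: str, save_path: str = "") -> Request:
    """Build the POST asking a qBittorrent Web UI to start downloading a magnet."""
    fields = {"urls": magnet}
    if save_path:
        fields["savepath"] = save_path
    return Request(
        f"{base_url.rstrip('/')}/api/v2/torrents/add",
        data=urlencode(fields).encode(),
    )

# urllib.request.urlopen(add_magnet_request("http://127.0.0.1:8080", "magnet:?xt=urn:btih:..."))
```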
this is an archive funded by the library of congress (threatened) and the national endowment for the humanities (actively being eliminated). the alternative is that an enormous amount of US history that doesn't percolate into history books is owned and operated by lexisnexis and other for-profit data brokers.
this is the first run of some tooling to lower the bar for participatory scraping - at the moment, the archive is still online, and the scraper will automatically embed a webseed URL in the created torrent. so even if you don't have space to seed, you can scrape the data, upload the torrent, and make it possible for waiting peers to become mirrors
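For anyone curious what "embed a webseed URL" means at the wire level: a webseed is just a top-level `url-list` key in the bencoded torrent (BEP 19). A toy bencoder with placeholder values (the example.org URLs are illustrations, not the real archive; a real info dict needs actual piece hashes):

```python
def bencode(obj) -> bytes:
    """Minimal bencoder (BEP 3): ints, strings/bytes, lists, and dicts."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, str):
        return bencode(obj.encode())
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        pairs = sorted((k.encode() if isinstance(k, str) else k, v)
                       for k, v in obj.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in pairs) + b"e"
    raise TypeError(f"cannot bencode {type(obj)!r}")

# An ordinary torrent plus the BEP 19 webseed key (placeholder values):
torrent = {
    "announce": "https://tracker.example.org/announce",
    "url-list": ["https://mirror.example.org/chronicling-america/"],
    "info": {"name": "dataset", "piece length": 262144,
             "length": 0, "pieces": b""},
}
```

Clients that support BEP 19 will fall back to fetching ranges from the `url-list` URL whenever no peer has a piece, which is exactly what lets waiting peers bootstrap from the still-online archive.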
if anyone is bored or wants to contribute to gray archive tech, i've done all the hard parts around this, but here is a set of things you could do to make "practical repack mutability for torrents" happen: https://codeberg.org/Safeguarding/-/projects/19508
so we have an indexer and a cli tool that can interact with clients. if we added one link table that allowed people to declare relationships between torrents - like e.g. if one replaces another, or is an updated version of, successor to, and so on, then one could plug in the pieces so the cli periodically checks for updated versions of torrents and swaps them out in the local client.
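The link table plus the "find the current version" lookup could be sketched like this (table and column names are my illustration, not SciOp's actual schema):

```python
import sqlite3

# Hypothetical schema: one edge table declaring relationships between torrents.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE torrent_link (
        from_infohash TEXT NOT NULL,   -- the newer / replacing torrent
        to_infohash   TEXT NOT NULL,   -- the older / replaced torrent
        relation      TEXT NOT NULL
            CHECK (relation IN ('replaces', 'updates', 'succeeds')),
        PRIMARY KEY (from_infohash, to_infohash, relation)
    )
""")

def current_version(infohash: str) -> str:
    """Follow replacement edges forward until no newer torrent is declared,
    so a client can swap a stale torrent for its latest successor."""
    while True:
        row = con.execute(
            "SELECT from_infohash FROM torrent_link"
            " WHERE to_infohash = ? AND relation IN ('replaces', 'updates')",
            (infohash,),
        ).fetchone()
        if row is None:
            return infohash
        infohash = row[0]
```

The cli side then reduces to: for each seeded infohash, ask the indexer for `current_version(infohash)`, and if it differs, remove the old torrent from the client and add the new one.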
this could be your name in the credits: "what if bittorrent trackers weren't just static repositories of torrent files and generic peer connection machines but could facilitate socio-technological resolutions to basic problems in the protocol."
I just closed da loop on the first example of the distributed, bittorrent backed, "archiveteam-warrior except anyone can do it and not necessarily have uploading to archive org being the only means of sharing". .. thing. And it's pretty good. In a couple months we've slowly spread our octopus tentacles over the whole bittorrent/scraping stack and it's cool to start seeing the pieces connect.
We had a basic problem: someone recognizes a dataset is about to disappear, and then we'd need a whole convoluted forum process where we post what we're taking, blah blah heroism, volunteerism, solving hard problems on the fly, love this group. Except sometimes only half the data would show up, or one person would end up needing to seed 20TB from their home connection. In short: uneven labor distribution.
Anyway, once we get the frontend and docs written we'll have a sketch of an idea: what if, instead of just distributing the data, we also distributed the scraping, making possible deduplicated, distributed crawl tasks that feed automatically back into feeds of torrents? Once we convince arvidn to make webtorrent enabled by default, we've got some cool news with the replayweb.page folks to share. Along with being able to share the work and mutually validate snapshots of the web, that's your distributed wayback machine.
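One way crawl tasks can be "deduplicated, distributed" is deterministic work-splitting: every peer hashes each task the same way and only claims its own share, so peers who agree only on the task list and bucket count split the work without coordinating. A toy sketch (my illustration of the idea, not what the SciOp tooling actually does):

```python
import hashlib

def task_bucket(task_url: str, n_buckets: int) -> int:
    """Deterministically map a crawl task URL to one of n_buckets slots.
    Every peer computes the same mapping, so a peer assigned bucket k
    scrapes only the URLs that hash to k."""
    digest = hashlib.sha256(task_url.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % n_buckets

# A peer holding bucket 3 of 8 would claim:
# [u for u in task_urls if task_bucket(u, 8) == 3]
```

Redundancy (so a flaky peer doesn't leave a gap) can be layered on by having each peer also take the next bucket over, at the cost of some duplicate fetches.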
Then it's time to start the federation part where it gets really interesting - making AP groups that can mutually coordinate archival work and publish it in private, overlapping torrent bubbles
Edit: here is the thing in code form, docs are the cherry on top: https://codeberg.org/Safeguarding/-/projects/18027
#sciop #Bittorrent #BittorrentIsStillHappening #ProtocolEvolutionEvenIfItsALittleCrusty #SeriouslyTheresSoMuchOpenSpaceInBittorrent
Update. Speak up to help #AED reduce the odds that valuable public US govt datasets will be taken down. 
https://essentialdata.us/
"Demonstrating the broad real-world value of federal data is the most strategic path to ensuring its continued flow. The goal of America's Essential Data is to make it easy for: … federal agency data stewards and their leadership to better understand the true value of their data, especially as it relates to administration priorities. Do you use a federal dataset that delivers important benefits for the American people? Help us tell the story of that dataset!"
#Censorship #DefendResearch #OpenData #Takedowns #Trump #USPol #USPolitics
Update. "#SciOp is part of Safeguarding Research & Culture (#SRC). The bits must flow: let us resurrect the ancient art of #Bittorrent to ensure that our cultural, intellectual and scientific heritage exists in multiple copies, in multiple places, and that no single entity or group of entities can make it all disappear."
https://sciop.net/ 
#Censorship #DefendResearch #OpenSource #Preservation #Takedowns #Trump #USPol #USPolitics
here is the part where i would usually say all the stuff i always say about how the only survivable archive is a distributed archive, and about how they won't be able to kill our ability to understand reality or our cultural memory, but i am tired and satisfied for a minute. you know the deal
special invite to any #BitTorrent heads out there who want to contribute to our uploaders guide, seeders guide, torrent creation guide, or want to help get the tracker scraping and announcing part off the ground. if you ever wondered what it would be like if there were better social overlays to bittorrent where trackers could announce around peers based on explicit social relationships, curate collections collaboratively across tracker instances... that's the kinda sicko shit we're into