Hello everyone,

I recently came across an article on TorrentFreak about the BitTorrent protocol and found myself wondering whether it has remained relevant in today's digital landscape. Given the rapid pace of technological change, I was curious whether BitTorrent has been surpassed by a more efficient protocol (I2P, perhaps?), or whether it continues to hold its ground.

Thank you for your insights!

[–] [email protected] 23 points 5 days ago (4 children)

A better question is: what would you improve over the current way that torrents work?

[–] [email protected] 14 points 5 days ago (2 children)

I wish there were some way for availability to persist even after a torrent's peak of popularity has passed - some kind of decentralized, self-healing archive that maintains a minimal presence for each torrent on the network. Old torrents could then become slow, but the archival system would keep them from being lost completely while distributing the storage burden efficiently. Maybe this isn't practical in terms of storage, but BitTorrent's tendency to lose older content can be frustrating.
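
A rough sketch of the policy I have in mind, in Python - everything here is hypothetical, since the BitTorrent protocol has no such hook today:

```python
import hashlib

TARGET_REPLICAS = 3  # archival copies we'd like to exist network-wide

def should_archive(infohash: bytes, node_id: bytes,
                   seeders: int, estimated_nodes: int) -> bool:
    """Decide whether this node volunteers to keep a fading torrent alive."""
    if seeders >= TARGET_REPLICAS:
        return False  # swarm is still healthy, do nothing
    # Deterministic pseudo-random draw, so a given (node, torrent)
    # pair always decides the same way instead of flapping.
    seed = hashlib.sha1(node_id + infohash).digest()
    draw = int.from_bytes(seed[:8], "big") / 2**64
    # Volunteer with probability TARGET_REPLICAS / N, so the expected
    # number of archivists across the whole network stays around the target.
    return draw < TARGET_REPLICAS / max(estimated_nodes, 1)
```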

[–] [email protected] 17 points 5 days ago

I don't see what you can do at the protocol level to improve availability; you still need people storing the file and acting as peers. Some trackers try to improve on that by incentivizing long-term seeding.
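
For example, the "bonus point" systems some private trackers run reward size, rarity, and seed time, roughly along these lines (constants made up, just to show the shape of the incentive):

```python
def bonus_per_hour(size_gib: float, seeders: int, seed_days: float) -> float:
    """Toy long-term seeding reward: bigger, rarer, older seeds earn more."""
    rarity = 1 / max(seeders, 1) ** 0.5     # fewer seeders => bigger reward
    age_bonus = 1 + min(seed_days / 30, 2)  # ramps up, capping at 3x
    return 0.5 * size_gib * rarity * age_bonus
```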

[–] [email protected] 5 points 5 days ago (1 children)

It’s called private trackers, and they are great.

[–] [email protected] 5 points 5 days ago (1 children)

Meh.. I get itchy when I hear "private". We could also improve the experience of seeding publicly, and for longer - not only through education, but maybe even with some kind of incentive to keep seeding.

[–] [email protected] 3 points 5 days ago

The issue is that public trackers are too easy for people to monitor and to pursue copyright infringement claims through. Private trackers, by design, are much harder to do that with, which makes them leaps and bounds safer to use.

Don't think of it as keeping the common man out; it's about keeping The Man out.

[–] [email protected] 10 points 5 days ago (2 children)

A better question is: what would you change in the current Internet/WWW to make it as decentralized as torrents are?

[–] [email protected] 7 points 5 days ago (5 children)

I wish there were a decentralised way of hosting websites - kind of like torrents.

[–] [email protected] 7 points 5 days ago (2 children)

Sounds like maybe what you're looking for is ipfs? https://ipfs.tech/

[–] [email protected] 5 points 5 days ago

The problem with IPFS is that it's not as decentralized as I wish it were. By default, data is not replicated across the network: if nobody else downloads and re-hosts the content, you are still the only node holding a copy. So if your connection goes down, or you get censored, there is no other node where the IPFS data lives. It only works if somebody else is actively fetching the data.

Oh, and then you also need to pin the content, or the data will be removed again -,-

Furthermore, look-up via the DHT is very slow - resolving data takes far too long to be usable. People today expect 1 or 2 seconds of look-up at most, so look-up plus page load should come to 4 or 5 seconds, max. With IPFS it can be 20 or 30 seconds, or even minutes...
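
To make the pinning part concrete: pinning only tells your *own* node not to garbage-collect a CID; it replicates nothing. A minimal sketch against a local Kubo daemon's HTTP RPC API (default port 5001; the CID is a placeholder):

```python
import requests

API = "http://127.0.0.1:5001/api/v0"  # Kubo's default local RPC endpoint
cid = "QmExampleCidGoesHere"          # placeholder, not a real CID

# Pin the CID so the local garbage collector keeps it. Note this does
# NOT push the data to any other node - which is exactly the problem.
resp = requests.post(f"{API}/pin/add", params={"arg": cid}, timeout=120)
resp.raise_for_status()
print(resp.json())  # e.g. {"Pins": ["QmExampleCidGoesHere"]}
```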

[–] [email protected] 0 points 5 days ago* (last edited 5 days ago) (1 children)

That's just for files, though. Imagine a protocol specifically designed for hosting decentralised websites.

You can technically host a website on IPFS, but it's a nightmare and makes updating the website basically impossible (see the 2021 Wikipedia IPFS mirror). A dedicated protocol would make it far more accessible.

[–] [email protected] 6 points 5 days ago (1 children)

Websites are just files. To run a site on IPFS, you'd want to pack everything into a few files, or just one, and serve that. Then you just open that file in the browser, and boom: site.

I'm not really sure it qualifies as a website any more at that point, but it's an IPFS site for sure. IPFS has links, right?
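
Something like this, assuming a local ipfs daemon is running (the gateway URL is just the public one):

```python
import subprocess

# Recursively add the site directory; -Q prints only the root CID.
root_cid = subprocess.run(
    ["ipfs", "add", "-r", "-Q", "site/"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# Relative links between the site's files keep working, because they
# all live under the same directory CID.
print(f"https://ipfs.io/ipfs/{root_cid}/index.html")
```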

[–] [email protected] 2 points 5 days ago (1 children)

With LibreWeb I tried to go this route, using the IPFS protocol. But as I mentioned above, IPFS is not as decentralized by design as people might think. People still need to download the content first and host a node.. and then ALSO pin the content.. It's not great. And DHT look-up takes way too long as well.

[–] [email protected] 2 points 5 days ago (1 children)

Well... it's not really designed for that use case, so yeah you'll have to deal with issues like that. For interplanetary file transfers, that's acceptable.

[–] [email protected] 1 points 5 days ago (1 children)

I'm searching for better alternatives; ideas are welcome.

[–] [email protected] 1 points 4 days ago (1 children)

Probably the closest thing would be an ActivityPub blog or a static site service.

[–] [email protected] 1 points 4 days ago

ActivityPub still relies on centralized DNS. I'm talking about a decentralized Web. And no, ActivityPub doesn't scale as well.

[–] [email protected] 6 points 5 days ago (2 children)

I'm personally trying to fix it: https://libreweb.org. Still a proof of concept, though.

[–] [email protected] 3 points 5 days ago

Looks really cool. Thanks for the share

[–] [email protected] 0 points 4 days ago (1 children)

Why the MIT license and not something like GPLv3?

[–] [email protected] 0 points 4 days ago (1 children)

The MIT license is more permissive.

[–] [email protected] 1 points 4 days ago (1 children)

Yeah, but then companies can use your work and not provide compensation. But to each their own.

[–] [email protected] 2 points 2 days ago

Yes that is true.

[–] [email protected] 2 points 4 days ago

That would be very cool. I know we have onion sites on the Tor network that use keypairs for their domains, but the sites themselves are still centrally hosted by a person: anonymously hosted, but still centrally hosted.
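
For reference, the "keypair as domain" part is quite elegant: a v3 onion address is just an encoding of the service's ed25519 public key plus a checksum, per Tor's rend-spec-v3. A sketch:

```python
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    """Derive a v3 .onion address from a 32-byte ed25519 public key."""
    version = b"\x03"
    checksum = hashlib.sha3_256(
        b".onion checksum" + pubkey + version
    ).digest()[:2]
    addr = base64.b32encode(pubkey + checksum + version).decode().lower()
    return addr + ".onion"

# Dummy all-zero key just to show the shape; a real service derives
# this from its actual keypair.
print(onion_v3_address(bytes(32)))
```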

[–] [email protected] 2 points 4 days ago* (last edited 4 days ago)

There is actually a JS library called Planktos that can serve static websites over BitTorrent. I don't know how good it is, but it sounds like a starting point.

https://github.com/xuset/planktos

[–] [email protected] 3 points 5 days ago (1 children)

There are some cryptobro projects about sticking distributed file sharing on top of ~ THE BLOCKCHAIN ~.

I'm skeptical, but it might actually be a valid use of such a thing.

[–] [email protected] 4 points 5 days ago

Blockchain is a nice technology, but not every solution needs it. Just as BitTorrent doesn't require a blockchain, a decentralized internet alternative doesn't need one either.

[–] [email protected] 3 points 5 days ago

The profit motive

[–] [email protected] 5 points 5 days ago (2 children)

Make mutable torrents possible.

[–] [email protected] 8 points 5 days ago (1 children)

What’s the advantage to that? I don’t want the torrent I’m downloading to change.

[–] [email protected] 3 points 5 days ago (2 children)

I want that. For example, you downloaded the Debian ISO version 13, and after some time it could be updated to 13.1. Obviously it shouldn't be an automatic operation unless you allowed it before starting the download.

[–] [email protected] 12 points 5 days ago* (last edited 5 days ago)

I wouldn't call that mutable - more like version tracking, in which each torrent is aware of future versions.

I kind of like that, but you might be able to accomplish it with a plugin or something.

Put a file in the torrent called "versions" or something like that, and in there would be a URL the client can use to tell you if there is a new version (sketched below).

It wouldn't change the protocol, though, since the new version and the old version would still need to be separate entities with different data and different seeding.
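
Something in this spirit actually exists at the protocol level as BEP 46 (updating torrents via DHT mutable items), but the plugin version could be as simple as this - the "versions" file format and field names here are made up:

```python
import json
import urllib.request

def check_for_update(versions_file: str, current_version: str):
    """Return the new infohash if the check URL reports a newer version."""
    with open(versions_file) as f:
        check_url = json.load(f)["check_url"]   # hypothetical file format
    with urllib.request.urlopen(check_url, timeout=10) as resp:
        latest = json.load(resp)
    if latest["version"] != current_version:
        return latest["infohash"]  # hand this to the client as a new torrent
    return None
```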

[–] [email protected] 4 points 5 days ago

Like the 13.1 torrent being only a patch to the 13 one and listing it as a dependency? Downloading the 13.1 torrent would transparently download 13 if it wasn't there already, then download the 13.1 patch and apply it. But I don't think any of this needs to be at the protocol level; that's client functionality.
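
As client functionality it could look roughly like this - every name here is hypothetical, and apply_patch stands in for a binary diff tool like bspatch:

```python
def materialize(torrent, download, apply_patch):
    """Resolve a chain of patch torrents down to the full payload.

    torrent.depends_on is a hypothetical metadata field pointing at the
    base torrent (None for a full release like 13); download() fetches a
    torrent's payload; apply_patch() applies the delta (e.g. 13 -> 13.1).
    """
    if torrent.depends_on is None:
        return download(torrent)            # full, self-contained torrent
    base = materialize(torrent.depends_on, download, apply_patch)
    return apply_patch(base, download(torrent))
```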

[–] [email protected] 4 points 5 days ago

Resilio Sync can do this, I'm pretty sure.

Although if implemented as an extension to BitTorrent, I'd want it to be append-only, because I don't want to lose 1.0 just because 1.1 becomes available.

[–] [email protected] 4 points 5 days ago

The last 0.01 percent comes in at the same speed as the rest of it.