The bridge just creates IMAP/SMTP servers, so you should be able to add it to Thunderbird on Android.
Something to consider is that any given instance can be a bad actor and do whatever the hell they want.
Each person doesn’t need to host everything.
The Internet Archive already has torrents that get automatically created. You can go right now, download/seed the torrents for some items, and you are immediately doing your part in decentralizing the Internet Archive.
But that is them accepting it.
Why federated and not just regular p2p?
The Internet Archive already supports torrents.
People generally don’t like being proselytized.
It’s 4 KB; it’s the demoscene.
To expand: the rendered video output is much more than 4 KB, but the file that produces the output can be that small. This is usually done by doing a bunch of math to generate the output dynamically.
You can kind of equate it to how a video game can generate 120 frames of 4K footage every second indefinitely, but the game itself is limited in size.
Recording the output takes up space, but you don’t need to record it if you can generate it on demand.
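To make that concrete, here’s a toy sketch in TypeScript (not an actual size-optimized intro, just an illustration of the idea) where a couple of short functions can keep producing new frames from nothing but pixel coordinates and a time value:

```typescript
// Toy illustration: a tiny amount of math can generate an endless stream of frames.
// Real 4 KB intros do this on the GPU with heavily size-optimized code; this only
// shows the concept of computing a pixel's color from its coordinates and time.
function pixel(x: number, y: number, t: number): [number, number, number] {
  const r = Math.sin(x * 0.05 + t) * 0.5 + 0.5;        // ripple over x
  const g = Math.sin(y * 0.05 + t * 1.3) * 0.5 + 0.5;  // different phase over y
  const b = Math.sin((x + y) * 0.03 - t) * 0.5 + 0.5;  // diagonal wave
  return [Math.floor(r * 255), Math.floor(g * 255), Math.floor(b * 255)];
}

// "Render" a frame for any time t: the code stays tiny, the output is unbounded.
function renderFrame(width: number, height: number, t: number): number[][][] {
  const frame: number[][][] = [];
  for (let y = 0; y < height; y++) {
    const row: number[][] = [];
    for (let x = 0; x < width; x++) row.push(pixel(x, y, t));
    frame.push(row);
  }
  return frame;
}
```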
I think text is going to be the most dense, information-wise. With plain text you could fit about 2,500 average-length books in 1 GB, and that’s not considering any compression.
Additionally, you could create a novel representation of words to reduce the total amount of text and include a key to expand it back out, replacing common groupings of letters like ‘ch’ with ‘k’, for example.
If you could get a 2:1 compression ratio from your modified alphabet and a 4:1 compression ratio from traditional compression algorithms, you could get up to 20 thousand books! That’s a book a day for 55 years.
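The arithmetic behind those numbers, assuming roughly 400 KB of plain text per average-length book (an assumption, not a measured figure):

```typescript
// Back-of-the-envelope math for books per gigabyte of plain text.
// The 2:1 (custom alphabet) and 4:1 (traditional compressor) ratios are the
// hypothetical ratios from the comment above, not measured results.
const GB = 1_000_000_000;      // bytes in a gigabyte
const bytesPerBook = 400_000;  // ~400 KB of plain text per average-length book

const uncompressed = GB / bytesPerBook;  // ≈ 2,500 books
const withEncoding = uncompressed * 2;   // 2:1 from the custom alphabet
const withBoth = withEncoding * 4;       // a further 4:1 from normal compression

console.log(uncompressed, withBoth, withBoth / 365); // ≈ 2500, 20000, ~55 years
```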
I think music is gonna take up way too much space. Compressed all the way down to 32 kbps, which is going to be a pretty miserable listening experience (everything will sound underwater), you are only going to get ~75-ish hours of music.
Cut that in half for a more tolerable 64kbps.
It’s a decent amount of music, but not a lifetime’s worth of your only entertainment imo.
Edit: for some context on audio, 320 kbps MP3 will only net you about 7 hours of music.
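Those figures fall straight out of the bitrate; a quick sketch of the math (using 1 GiB here, which is why 32 kbps lands near 75 hours):

```typescript
// Hours of audio per gigabyte at a given constant bitrate.
// Bitrate is in kilobits per second; "gigabyte" here is 1 GiB (2^30 bytes).
function hoursPerGig(kbps: number): number {
  const bytesPerSecond = (kbps * 1000) / 8; // kilobits/s -> bytes/s
  const seconds = 2 ** 30 / bytesPerSecond; // how long 1 GiB lasts
  return seconds / 3600;
}

console.log(hoursPerGig(32).toFixed(1));  // ~74.6 hours (underwater quality)
console.log(hoursPerGig(64).toFixed(1));  // ~37.3 hours
console.log(hoursPerGig(320).toFixed(1)); // ~7.5 hours (320 kbps MP3)
```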
I mean, how do you think websites work? Of course your mouse and keyboard events are available, otherwise you wouldn’t be able to interact with a website at all.
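For example, a page’s own script can subscribe to those events through the standard DOM API; nothing exotic is needed (sketch below assumes it runs in a browser):

```typescript
// Ordinary DOM event listeners: every website can see the pointer and key events
// that happen on its own page; that's how buttons, games, and text boxes work.
document.addEventListener("mousemove", (e: MouseEvent) => {
  console.log(`pointer at ${e.clientX}, ${e.clientY}`);
});

document.addEventListener("keydown", (e: KeyboardEvent) => {
  console.log(`key pressed: ${e.key}`);
});
```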
Proton is not actually sandboxed the way a real container is.
A) If the program running in Proton were given root access in some way, say by tricking people into entering their root password for a claimed update, it would have complete control of your entire system, just like any native program.
B) Apps running in Proton still have access to the regular file system.
Wine isn’t an emulator or a VM.
Thanks for the breakdown! This is probably the most helpful breakdown I’ve seen of a build like this.
Yeah I do, you brought up that local isn’t always the option.
I desperately want it to work for me, I just can’t get it to work without spending thousands of dollars on hardware just to get back to the same experience as having a regular desktop at my desk.
What is the cost of the thin clients and are you doing this over copper?
Are your desks multi-monitor? To get the bare minimum in my household’s scenario, I would need at least 12 streams at greater than 1080p.
For 5 seats, how much did it cost versus just having a computer in each location? For example, looking at HDBaseT to replace just my desk setup, I would need four ~$350 devices, just going by Monoprice for an idea (https://www.monoprice.com/product?p_id=21669), and that doesn’t even cover all of the screens in my office.
Right, but who has the resources to rent compute with multiple GPUs? This is a gaming setup, not office work, and the OP was talking about racking it.
All of those services offer an inferior experience to being at the hardware; it’s just not the same experience. Seriously, try it with multiple 1440p 144 Hz displays, it just doesn’t work out well, and you are getting a compromised product for a higher cost. You need a good GPU (or at least a way to decode multiple HEVC streams) in the client, so you can’t just run a standard thin client.
‘Low latency’ means a near-native experience. I’m talking, you sit down at your desk and it feels like you are at your computer (that is to say, multiple monitors, HDR, USB swapping, Bluetooth, audio, etc., all working seamlessly without noticeably diminished quality). Anything less isn’t worth it, since you can just use your computer like normal.
A DisplayPort-to-fiber extender is $2,000. The fiber is not for the network.
Moonlight does not do what I want; Moonlight requires a GPU on the thin client to decode. You would need a high-end GPU to decode multiple high-resolution video streams. Also, afaik, Moonlight doesn’t support multiple displays.
Can this solution deliver 3+ streams of high resolution (1440p or higher and 144fps) low latency video with no artifacting and near native performance and responsiveness?
Gaming has a high requirement for high-fidelity, low-latency I/O. No one wants to spend all this money on racks and thin clients and then get laggy windows and scrolling, artifacts, video compression, and low resolution.
That’s the problem at hand with a gaming server: if you want to replace a gaming desktop with a VM in a rack, you need to actually get the I/O to the user somehow, either through dedicated cables from the rack, fiber, or networking. The first is impractical; it involves potentially 100 ft long runs of multiple DisplayPort, HDMI, USB, etc. cables and is very rigid in its application. The second is very expensive, shooting the price up to thousands of dollars per seat for DisplayPort/USB-over-fiber extenders. And for the third option, I have yet to see a VNC/remote solution that can deliver near-native video performance.
I should reiterate: the OP wants to do fidelity-sensitive tasks, like video editing; they don’t just need to work on a spreadsheet.
None of the presented solutions cover the aspect of being in a different place than the rack. The same network is fine, but at a minimum a different room.
How do you deliver high resolution (e.g. 1440p, 144 fps) to multiple monitors with low latency over a network? I haven’t seen anything like that accomplished without running fiber from the host.
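Some rough numbers on why that’s hard over a normal network: uncompressed, a single 1440p 144 fps stream is already well past 10 gigabit Ethernet (assuming 24-bit color), which is why the practical options are heavy compression or dedicated DisplayPort/fiber extenders:

```typescript
// Raw (uncompressed) bandwidth of one video stream, in gigabits per second.
// 1440p at 144 fps with 24-bit color; real remote-desktop links compress,
// which is where the latency and artifacts come from.
function rawGbps(width: number, height: number, fps: number, bitsPerPixel = 24): number {
  return (width * height * bitsPerPixel * fps) / 1e9;
}

const oneStream = rawGbps(2560, 1440, 144);                    // ≈ 12.7 Gbps per monitor
console.log(oneStream.toFixed(1), (oneStream * 3).toFixed(1)); // ~12.7 and ~38.2 Gbps for 3 monitors
```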
Eventually, your thin client will need too much power anyway, making the costs rise a lot. It makes sense in an office where you have 500 seats and you can load balance resources.
If someone can show me a multi-seat gaming server that has native remote performance (as in you drag windows around at 144 fps, not the standard artifacty, high-latency behavior of VNC), I’ll eat a shoe.
Yes smartphones and tablets have replaced desktops for most general users.
This is something people fail to realize, and I think part of it is because Linux people tend to surround themselves with other Linux people.
I have been helping my friend get into Linux. We picked a sensible distro, Fedora, with the default GNOME spin. He loves the UI, great.
But there is a random problem with his microphone: everything is garbled. I can’t recreate it on my hardware, and the cause is unclear.
He reads guides and randomly inputs terminal commands, things get borked, he reinstalls, and the cycle continues.
He tries a different distro; the microphone works, but World of Warcraft is funky with Lutris, so no go.
The result is, all of this shit just works on Windows, and it just doesn’t on Linux. Progress has been made in compatibility, but, for example, there was a whole day of learning just about X vs Wayland and not actually getting to use the computer. For someone who has never opened a terminal before, something as simple to you and me as adding a package repo is complete gibberish.
Yes, you can learn all of this, but to quote this friend who has been trying Linux for the past two weeks: “I’m just gonna reinstall Windows and go back to living my life after work.”
When you have 20 years of understanding Windows, you need to be nearly 1:1 with that platform to get people to switch.
CLI doesn’t make much sense to me either when the *arr suite already has a well-documented REST API.
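As a rough illustration, hitting the API is only a few lines. This sketch assumes a Sonarr instance at localhost:8989 with a placeholder API key (the host and key are assumptions; the /api/v3/series endpoint and X-Api-Key header are from Sonarr’s v3 API):

```typescript
// Minimal sketch: list series from a Sonarr instance via its REST API.
// SONARR_URL and API_KEY are placeholders; the key lives under Settings -> General in Sonarr.
const SONARR_URL = "http://localhost:8989";
const API_KEY = "your-api-key-here";

async function listSeries(): Promise<void> {
  const res = await fetch(`${SONARR_URL}/api/v3/series`, {
    headers: { "X-Api-Key": API_KEY }, // Sonarr authenticates requests via this header
  });
  if (!res.ok) throw new Error(`Sonarr returned ${res.status}`);
  const series: Array<{ title: string; monitored: boolean }> = await res.json();
  for (const s of series) {
    console.log(`${s.monitored ? "[monitored]" : "[ignored]  "} ${s.title}`);
  }
}

listSeries().catch(console.error);
```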