I’m the administrator of kbin.life, a general-purpose/tech-oriented kbin instance.

  • 0 Posts
  • 4 Comments
Joined 2 years ago
Cake day: June 29th, 2023


  • We do run .deb/.rpm files from random websites though.

    In general, Linux sites offering .deb/.rpm/etc. files usually publish hashes so you can verify you’ve got the genuine version (there’s a small verification sketch at the end of this comment). That’s not to say the actual author of the package couldn’t be malicious.

    And you mentioned Flatpak too. AppImage is quite popular as well, and afaik that doesn’t have any built-in sandboxing at all.

    Even with sandboxing, these apps generally need access to save and load files in the host environment. Where are those permissions defined? Could a malicious actor, for example, grant their own malicious AppImage/Flatpak more access? Genuine questions, I’ve never looked into how these work (the second sketch below shows where Flatpak at least keeps them).
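    On the hashes point, a minimal sketch of that verification in Python, assuming the site publishes a SHA-256 digest next to the package (the file name and digest here are hypothetical):

    ```python
    import hashlib
    import hmac

    def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
        # Hash in chunks so a large package never has to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify(package_path: str, published_digest: str) -> bool:
        # compare_digest is a constant-time comparison; overkill here, but harmless.
        return hmac.compare_digest(sha256_of(package_path), published_digest.lower())

    # Hypothetical name and digest: the real digest comes from the project's site.
    if verify("example-app_1.0_amd64.deb",
              "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"):
        print("checksum matches the published value")
    else:
        print("checksum mismatch, do not install")
    ```

    Of course this only proves the file matches what the site published; if the site or the author is malicious, the hash proves nothing, which is the caveat above.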
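    And on where the permissions are defined: for Flatpak, as far as I can tell, they come from the manifest’s finish-args and end up in a keyfile called metadata inside the app’s install directory, so yes, a publisher can grant their own app broad access like filesystems=host. A rough sketch for spotting that, assuming a system-wide install under /var/lib/flatpak and a made-up app ID:

    ```python
    import configparser
    from pathlib import Path

    # Hypothetical app ID. System installs live under /var/lib/flatpak,
    # per-user installs under ~/.local/share/flatpak.
    app_id = "org.example.App"
    metadata = Path(f"/var/lib/flatpak/app/{app_id}/current/active/metadata")

    cfg = configparser.ConfigParser()
    cfg.read_string(metadata.read_text())

    # The [Context] section lists the sandbox holes granted at build time,
    # e.g. shared=network;ipc; sockets=x11;pulseaudio; filesystems=host;
    context = cfg["Context"] if cfg.has_section("Context") else {}
    filesystems = [f for f in context.get("filesystems", "").split(";") if f]

    if "host" in filesystems or "home" in filesystems:
        print(f"{app_id} has broad filesystem access: {filesystems}")
    else:
        print(f"{app_id} filesystem access: {filesystems or 'none declared'}")
    ```

    flatpak info --show-permissions prints the same section, and flatpak override lets the user tighten (or loosen) it after install. So the permissions are ultimately the publisher’s choice, and it’s on the user or the store to review them.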


  • I think there are a few aspects to this whole subject.

    First of all, for a long time people have assumed Linux isn’t a target for malware. I would say it has been a target, and has been for decades. I recall in the late 90s a Linux server at work was attacked: script kiddies in Brazil installed a rootkit, an IRC trojan and an attack kit. The most you can say is that desktop users aren’t usually a target, which is mostly true. But with the share of desktop installs hitting a recent high, we should expect that to change.

    Second, most Windows antivirus products (including the built-in one) do some actively useful things. Most of those aren’t relevant on Linux, since we generally don’t run setup.exe from random websites. However! Here’s where things get interesting: the rise of Flatpak and other containerised applications. These are, I’d say, very similar to setup.exe, and it would be trivial to embed malware in such a file, so a Linux virus scanner could be checking them (there’s a sketch of the idea after this comment). We’ve also seen direct attacks on distro repositories lately, and I don’t expect that to slow down. We are most certainly a target now.

    Third, the other reason most Linux users don’t run virus scanners is that they’re usually technical people who would (usually) notice something wrong and investigate and spot the malware. Two things are changing here: simpler-to-install distros are bringing less technical people to Linux, and the number of processes sitting on a desktop machine doing effectively nothing is far higher than it used to be, so even technical people can be caught off guard. And a well-made rootkit can hide all of these clues anyway.

    So I would say there’s a really good space for a well-made virus scanner/antivirus now. It is probably the right time for it.
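    To make the “a scanner could be checking these” point concrete: AppImages, for example, are plain ELF executables with magic bytes (‘AI’ plus a type byte) at offset 8 of the header per the AppImage spec, so it’s cheap to find candidates to hand to a scanner. A rough sketch (the download directory is just an example):

    ```python
    from pathlib import Path

    def looks_like_appimage(path: Path) -> bool:
        # Per the AppImage spec, the header carries 'AI' and a type byte
        # (0x01 or 0x02) at offset 8, inside the ELF identification padding.
        with open(path, "rb") as f:
            header = f.read(11)
        return (header[:4] == b"\x7fELF"
                and header[8:10] == b"AI"
                and header[10:11] in (b"\x01", b"\x02"))

    # Hypothetical location: wherever downloaded binaries end up.
    for p in Path.home().joinpath("Downloads").glob("*"):
        if p.is_file() and looks_like_appimage(p):
            print(f"{p} looks like an AppImage, worth scanning before first run")
    ```

    A real scanner would obviously go further (unpack the embedded squashfs, check signatures, and so on), but the point is these bundles are easy to identify and are exactly the kind of artefact worth checking.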


  • I think my question on all this is whether it would ultimately cause problems in terms of data integrity.

    Currently, most amplifiers for digital optical signals capture the information in the light, strip off any modulation to get at the raw data, and then re-modulate that data using a new emitter.

    The advantage of doing this over just amplifying the original light signal is the same reason switches/routers are store-and-forward (or at least decode to binary and re-modulate): when you decode the data from the modulated signal and then reproduce it, you strip out any accumulated noise and send a clean signal again.

    If you just amplify light (or electrical) signals “as-is”, you generally add a little noise each time, reducing the SNR. After enough hops the signal becomes unrecoverable.

    So I guess my question is: does this process have the same kind of ultimate limit on how many times you can re-transmit the signal without degradation? (The toy simulation below illustrates the contrast.)
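    A toy numpy simulation of the difference, with ±1 signalling and the same amount of Gaussian noise added on every hop (all numbers made up for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, 100_000)
    tx = 2.0 * bits - 1.0   # map 0/1 to -1/+1
    sigma = 0.3             # assumed per-hop noise level
    hops = 20

    analog = tx.copy()
    regen = tx.copy()
    for _ in range(hops):
        # Plain amplification: gain restores the level but the noise stays.
        analog = analog + rng.normal(0.0, sigma, analog.shape)
        # Regeneration: decide each bit, then re-transmit a clean +/-1.
        regen = np.sign(regen + rng.normal(0.0, sigma, regen.shape))

    ber_analog = np.mean((analog > 0) != (bits == 1))
    ber_regen = np.mean((regen > 0) != (bits == 1))
    print(f"bit errors after {hops} analog hops:      {ber_analog:.4%}")
    print(f"bit errors after {hops} regenerated hops: {ber_regen:.4%}")
    ```

    With these made-up numbers the analog chain ends up around a 20% error rate while the regenerated chain stays well under 1%. So the answer I’d expect is: regeneration has a limit too, since every decision has a small chance of flipping a bit and those errors accumulate hop by hop, but it degrades far more slowly than raw amplification, and error-correcting codes can mop up what’s left.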