• HeyJoe@lemmy.world · 27 points · 1 day ago

      300 TB is a lot, but it's kind of crazy to think this entire company only needs 300 TB storage arrays to function. I wonder how they handle things internally. I would imagine at least one backup server ready to go in HA. I wonder if they have multiple regions across the country that also serve up the same setup.

      • da_cow (she/her)@feddit.org · 1 point · 1 hour ago

        This isn't the entirety of Spotify. If they had archived everything in 160 kbps OGG Vorbis it would have been 700+ TB. There's a bucketload of songs that literally no one listens to.
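        A quick back-of-envelope check on that figure (the average track length and the catalog-size comparison below are my assumptions, not from the post):

```python
# Back-of-envelope: how much catalog fits in ~700 TB at 160 kbps?
# Assumption (mine, not from the post): ~4-minute average track.
BITRATE_BPS = 160_000          # 160 kbps OGG Vorbis
AVG_TRACK_SECONDS = 4 * 60     # assumed average track length

bytes_per_track = BITRATE_BPS / 8 * AVG_TRACK_SECONDS   # ~4.8 MB
tracks_in_700tb = 700e12 / bytes_per_track

print(f"~{bytes_per_track / 1e6:.1f} MB per track")
print(f"~{tracks_in_700tb / 1e6:.0f} million tracks in 700 TB")
# -> on the order of 150 million tracks, which is the right order
#    of magnitude for a full catalog of 100M+ songs.
```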

      • rainwall@piefed.social · 19 points · 21 hours ago

        Likely cloned Netflix's "Netflix in a box" design (Open Connect), where they drop a large 200 TB+ NAS in thousands of different CDN datacenters with their most popular content cached, so that total traffic across the internet at large is minimal.

        Spotify mainly being music with very little video likely makes this even easier.
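        For illustration, here's a minimal sketch of the popularity-based selection such an edge box implies - greedy most-played-first pinning under a capacity budget. The names and numbers are made up for the example; this is not Netflix's or Spotify's actual placement logic:

```python
# Minimal sketch: pick which tracks to pin on an edge cache,
# greedily by play count. Purely illustrative.
CACHE_BYTES = 200e12  # assumed edge NAS capacity (~200 TB)

def pick_edge_set(tracks):
    """tracks: list of (track_id, size_bytes, play_count)."""
    used = 0
    pinned = []
    # Most-played first; stop adding anything that would overflow.
    for track_id, size, plays in sorted(tracks, key=lambda t: -t[2]):
        if used + size <= CACHE_BYTES:
            pinned.append(track_id)
            used += size
    return pinned

# Example with three hypothetical tracks (all fit at this capacity).
catalog = [("hit", 5e6, 1_000_000),
           ("album_cut", 5e6, 20_000),
           ("obscure", 5e6, 3)]
print(pick_edge_set(catalog))  # -> ['hit', 'album_cut', 'obscure']
```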

      • JohnEdwa@sopuli.xyz · 9 points · 23 hours ago

        IIRC there's still something like 700 TB of low-popularity music missing, but it accounts for only about 0.4% of listens.
        And they need more storage overall because they have to set up datacenters around the world - it doesn't make sense to stream tens of millions of connections across the ocean. But that also gives all the backups one would need for "free".
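        Putting the figures from this thread together (the number of regions is my guess, purely for illustration):

```python
# Rough global-footprint arithmetic from the numbers in this thread.
# The regional replica count is an assumption for illustration.
POPULAR_TB = 300    # the archived "popular" slice discussed above
LONG_TAIL_TB = 700  # the low-popularity remainder (IIRC figure)
REGIONS = 5         # assumed number of regional datacenters

catalog_tb = POPULAR_TB + LONG_TAIL_TB
replicated_tb = catalog_tb * REGIONS

print(f"one full copy: ~{catalog_tb} TB (~{catalog_tb / 1000:.0f} PB)")
print(f"{REGIONS} regional copies: ~{replicated_tb} TB "
      f"(~{replicated_tb / 1000:.0f} PB)")
# Each regional copy doubles as a backup of the others - the "free"
# redundancy mentioned above.
```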

      • 🦄🦄🦄@feddit.org · 7 points · 1 day ago

        Afaik 300 TB is just the most popular music, and around a third of all tracks. The blog post on Anna's Archive is quite entertaining tho.

        • HeyJoe@lemmy.world · 4 points · 1 day ago

          Oh I know, I work in the industry as well. Our company's backups alone for workstations and servers are just under 1 petabyte. That is then replicated to an offsite location, which is also our disaster recovery location, and also stored in long-term storage in Azure. This is just backups, sooo much money for backups haha. That's why I am shocked that this entire company can run off of 300 TB, which is a lot, but nothing when you consider it's the entire business model for them.
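          To make the "sooo much money" concrete, here's a toy version of that footprint (primary + offsite DR replica + cloud archive tier). The per-TB prices are placeholder assumptions, not real Azure rates:

```python
# Toy footprint/cost sketch for the backup layout described above.
# All prices are placeholder assumptions, not real Azure rates.
PRIMARY_TB = 1000  # ~1 PB of workstation/server backups
COPIES = {
    "onsite backup":        (PRIMARY_TB, 20.0),  # $/TB-month, assumed
    "offsite DR replica":   (PRIMARY_TB, 20.0),  # assumed
    "Azure long-term tier": (PRIMARY_TB,  2.0),  # assumed archive rate
}

total_tb = sum(tb for tb, _ in COPIES.values())
monthly = sum(tb * rate for tb, rate in COPIES.values())

print(f"total stored: {total_tb} TB across {len(COPIES)} copies")
print(f"assumed monthly cost: ${monthly:,.0f}")
# -> 3000 TB held just to protect ~1 PB of data;
#    backup storage multiplies fast.
```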

          I think the craziest thing I've seen is we have these instruments that do genome testing and sequencing, and they would create like 10 TB worth of data per month. Every month they got their own 10 TB drive handed to them to back up their stuff on their own, on top of the backups we did for them.

          • Valmond@lemmy.dbzer0.com · 2 points · 9 hours ago

            I worked with visualisation of scientific data, up to 1 petabyte: multi-channel 3D realtime visualisation without degradation. One client had 1.5 TB of RAM. Interesting times.