Microsoft leaks 38TB of private data via unsecured Azure storage::The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.

  • pavnilschanda@lemmy.world · 1 year ago

    This will definitely make customers less trustful of Microsoft when dealing with their privacy-focused AI projects. Here’s hoping that open-source LLMs become more advanced and optimized.

    • Tatters@feddit.uk · 1 year ago

      I am not sure. This was mostly a case of human error in not properly securing URLs/storage accounts. The lack of centralised control of SAS tokens that the article highlights was a contributing factor, but the root cause was human error.

      If I leave my front door unlocked and someone walks in and robs my house, who is to blame? Me, for not locking the door? Or the house builder, for not providing a sensor so I can remotely check whether the door is locked?
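      For context on why one unchecked token caused so much exposure: a SAS token is essentially a cryptographic signature over a grant of permissions, scope, and expiry, keyed with the storage account key, so whoever holds the URL holds the access. The sketch below is a simplified, stdlib-only illustration of that idea (not the exact Azure string-to-sign format; the field layout and values are invented for the example) and contrasts a narrowly scoped token with an over-permissive one like the token in the leak.

```python
import base64
import hashlib
import hmac

def sign_sas(account_key: str, permissions: str, expiry: str, resource: str) -> str:
    """Simplified SAS-style signature: an HMAC-SHA256 over the granted
    permissions, expiry, and resource path, keyed with the (base64-encoded)
    storage account key. Real Azure SAS strings-to-sign carry more fields."""
    string_to_sign = "\n".join([permissions, expiry, resource])
    key = base64.b64decode(account_key)
    sig = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(sig).decode("utf-8")

# Hypothetical account key, for illustration only.
account_key = base64.b64encode(b"demo-account-key").decode("utf-8")

# Scoped token: read-only ("r"), one container, short expiry.
scoped = sign_sas(account_key, "r", "2020-08-01T00:00:00Z", "/account/models")

# Over-permissive token: read/add/create/write/delete/list on the whole
# account, expiring decades out. Nothing in the URL itself reveals how
# broad it is -- which is why centralised token inventory matters.
leaky = sign_sas(account_key, "racwdl", "2051-10-05T00:00:00Z", "/account/")
```

      The point of the sketch: both tokens look like opaque signed strings, and once an over-broad one is published in a repo, the only remediations are expiry or rotating the account key.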

      • NeoNachtwaechter@lemmy.world · 1 year ago

        If I leave my front door unlocked and someone walks in and robs my house, who is to blame?

        In a private environment, one person’s mistake can happen, period.

        A corporate environment absolutely needs robust procedures in place to protect the company and all its clients from the huge impact of one person’s mistake.

        But that’s a looong tradition at M$ - not having it, I mean.

      • Sethayy@sh.itjust.works · 1 year ago

        A better analogy is if you live in an apartment and the landlord doesn’t replace the front door locks when they break.