In June, the Texas Department of Public Safety (DPS) signed an acquisition plan for a five-year, nearly $5.3 million contract for a controversial surveillance tool called Tangles from tech firm PenLink, according to records obtained by the Texas Observer through a public information request. The deal is nearly twice as large as the company’s two-year, $2.7 million contract with federal Immigration and Customs Enforcement (ICE).

Tangles is an artificial intelligence-powered web platform that scrapes information from the open, deep, and dark web. Tangles’ premier add-on feature, WebLoc, is controversial among digital privacy advocates. Any client who purchases access to WebLoc can track different mobile devices’ movements in a specific, virtual area selected by the user, through a capability called “geofencing.” Users of software like Tangles can do this without a search warrant or subpoena. (In a high-profile ruling, the Fifth Circuit recently held that police cannot compel companies like Google to hand over data obtained through geofencing.) Device-tracking services rely on location pings and other personal data pulled from smartphones, usually via in-app advertisers. Surveillance tech companies then buy this information from data brokers and sell access to it as part of their products.
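In rough terms, the geofence query described above takes a pile of broker-sourced location pings, a bounding box, and a time window, and returns every device seen inside. The following is a minimal illustrative sketch of that idea, not PenLink's actual implementation; all names and fields are hypothetical, and real products use polygon fences and far richer data.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Ping:
    """One location ping as it might arrive from a data broker (hypothetical schema)."""
    device_id: str   # typically the device's mobile ad ID
    lat: float
    lon: float
    ts: int          # Unix timestamp of the ping


def in_geofence(p: Ping, lat_min: float, lat_max: float,
                lon_min: float, lon_max: float,
                t_start: int, t_end: int) -> bool:
    """True if the ping falls inside the bounding box and the time window."""
    return (lat_min <= p.lat <= lat_max
            and lon_min <= p.lon <= lon_max
            and t_start <= p.ts <= t_end)


def devices_in_area(pings, **bounds) -> set:
    """Distinct device IDs with at least one ping matching the geofence query."""
    return {p.device_id for p in pings if in_geofence(p, **bounds)}
```

The key point the sketch makes concrete: no warrant gates this query, because the input is commercially purchased data rather than records subpoenaed from a company.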

WebLoc can even be used to access a device’s mobile ad ID, a string of numbers and letters that acts as a unique identifier for mobile devices in the ad marketing ecosystem, according to a US Office of Naval Intelligence procurement notice.
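The ad ID matters because it is the same string across every dataset a broker sells, so it functions as a join key linking otherwise unrelated records. A toy sketch of that linkage follows; the records are entirely hypothetical, and the ID is made up (it merely follows the Android advertising-ID format).

```python
# Hypothetical records from two different brokers, keyed by the same
# mobile ad ID (the ID below is invented, in Android advertising-ID format).
AD_ID = "38400000-8cf0-11bd-b23e-10b96e40000d"

location_data = {AD_ID: {"last_seen": (30.2672, -97.7431)}}
app_profile = {AD_ID: {"interests": ["navigation", "dating"], "home_zip": "78701"}}


def merge_on_ad_id(a: dict, b: dict) -> dict:
    """Join two broker datasets on their shared ad-ID keys."""
    return {k: {**a[k], **b[k]} for k in a.keys() & b.keys()}


combined = merge_on_ad_id(location_data, app_profile)
# combined[AD_ID] now holds location and app-profile fields together.
```

This is the "decontextualization" Christl objects to below: each dataset was collected for one narrow purpose, but the shared key lets a third party fuse them into a profile none of the original apps ever held.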

Wolfie Christl, a public interest researcher and digital rights activist based in Vienna, Austria, argues that data collected for a specific purpose, such as navigation or dating apps, should not be used by different parties for unrelated reasons. “It’s a disaster,” Christl told the Observer. “It’s the largest possible imaginable decontextualization of data. … This cannot be how our future digital society looks like.”

Archived at https://web.archive.org/web/20240827115133/https://www.texasobserver.org/texas-dps-surveillance-tangle-cobwebs/

        • anarchrist@lemmy.dbzer0.com · 2 months ago

          Sorry if there was confusion. My main point: leave your narc device at home when doing crimes. Have a good day!

          • BrianTheeBiscuiteer@lemmy.world · 2 months ago
            Correlation is not causation. This only indicates a person is in the general area [during a crime] and not that they perpetrated it. People go to jail, wrongfully, with less evidence than this.

            • Angry_Autist (he/him)@lemmy.world · 2 months ago

              They’re not going to use it as evidence in an arrest; they’re going to use it to target social dissidents, which in Texas’s case is everyone who isn’t a fascist.

              They know they can’t use it as evidence, but they also know ten thousand other cruel and vicious things they can get away with.

            • anarchrist@lemmy.dbzer0.com · 2 months ago

              No, also probably when the AI pattern matches your behavior to a criminal’s behavior because you live in the same neighborhood.

              Again, I’m not saying this isn’t bad; I’m saying Texas has no idea what they bought or how to use it. The only practical way to use it is the way the feds do, and if they try the AI shit it will likely fuck them legally speaking at the federal level. OR orange Julius wins and the NSA starts just giving this shit to Texas, so this will all be moot.

              I think they got grifted out of $5 mil by AI hucksters.

              • NocturnalMorning@lemmy.world · 2 months ago

                I don’t think you have any idea what you’re talking about. This is exactly the kind of thing AI is good at, pattern recognition.

                • BrianTheeBiscuiteer@lemmy.world · 2 months ago

                  Good at avoiding false negatives, not so good at avoiding false positives. IMHO a 1% false-positive rate is unacceptable when the result is ruining someone’s life.

                • anarchrist@lemmy.dbzer0.com · 2 months ago

                  OK, explain then. The AI flags you as a criminal and the cops give you a ticket for looking like a crim? The burden of proof is on the state. Now they have 2x more shit to investigate, which means more cop hours. Idk, like I also hate the privacy aspect of this, but it seems like a boondoggle that will also waste lots of taxpayer money, and it would be good to attack it from two rhetorical angles.

                  • NocturnalMorning@lemmy.world · 2 months ago

                    “The burden of proof is on the state”

                    Do you like know anything about how our system works? People get slapped with frivolous tickets and lawsuits every day, and cops don’t have to deal with shit from it.

                • Angry_Autist (he/him)@lemmy.world · 2 months ago

                  Because of stupid fucking memes and whiny furry ‘artists’ all of lemmy thinks the greatest danger of AI is someone not getting paid for their drawn porn getting scraped.

                  The REAL danger is that AI can piece together nearly every aspect of your schedule, personality, income, pregnancy status, class, social circle, race, and medical history just by correlating anonymous data.

                  It’s already happening, hell it already happened 15 years ago and now they are just that much better.

                  But every FUCKDAMN top comment in this thread is a fucking joke or sarcasm

    • Angry_Autist (he/him)@lemmy.world · 2 months ago

      Almost no one in this thread cares and they are all memeing like this is an ‘ow my balls’ clip.

      Frankly I’m starting to think we deserve this