ah yes, the other TCP
Tasty Consensual Photos
Maybe they should patent it, to protect their TCP IP.
Or have some higher tier version called Ultimate Cookie Protection (UDP)
LOL
I’d prefer a security-oriented Secure Cookie Total Protection (SCTP)
Wouldn’t that be Ultimate Dookie Protection?
dammit, yes
Aren’t cookies already limited to the site at which they were created??
What the fuck? You mean to tell me sites have been sharing cookies?
I thought all browsers only delivered cookies back to the same site.
NO.
https://en.m.wikipedia.org/wiki/Third-party_cookies
Maybe it’s not allowed in your local jurisdiction? But it’s been a problem since forever.
I know Facebook and Reddit are in cahoots.
I went to visit Reddit a couple weeks back to read the Deadpool & Wolverine comments, but used the wrong container tab and now Facebook feeds me endless Marvel related stuff.
A lot of it is culture war bullshit too. Hmmmmm 🤔
No, you don’t know anything. Just because you have a suspicion because something happened to you once doesn’t mean you are sure in any way.
Nah I’m sure.
I never once saw a post about Marvel fed to me by Facebook and now it’s constant
Did it start after the extremely popular marvel movie “Deadpool and Wolverine” released?
Lol that’s your argument for why you think they don’t know what they’re talking about? Because all you did is make yourself seem like you have no idea how cookies work 🤣
It’s just one single person who noticed something once.
That’s an awful awful sample size absolutely filled with bias and thought fallacies.
Before. “Couple weeks” is more like 5 months at this point now that I think about it.
I don’t mind some of it much, but the obvious culture-war bait is infuriating.
It’s not because the movie just came out. I’ve been diligent about keeping Facebook and Reddit in their container tabs for years. It’s just Marvel stuff, not just the movies. Marvel’s been putting out huge movies for years and this hasn’t happened around any of their other releases.
Why aren’t you just using the official automatic Facebook container?
I made it before that was a thing. Habit I guess. 🤷‍♂️
The problem is that a website is generally not served from one domain.
Put a Facebook like button on your website, it’s loaded directly from Facebook servers. Now they can put a cookie on your computer with an identifier.
Now every site you visit with a Facebook like button, they know it was you. They can watch you as you move around the web.
Google does this at a larger scale. Every site with Google ads on it. Every site using Google analytics. Every site that embeds a Google map. They can stick a cookie in and know you were there.
Is this also how they know which ads to feed you?
Yes, it’s the reason for the tracking. To sell more targeted ads.
If you’re up for reading some shenanigans, check out the book Mindf*ck. It’s about the Cambridge Analytica scandal, written by a whistleblower, and details election manipulation using data collected from Facebook and other public or purchased data.
Is that because the like button is an iframe?
It doesn’t have to be. Your browser sends the cookies for a domain with every request to that domain. So you have a website example.com, that embeds a Facebook like button from Facebook.com.
When your browser downloads the page, it requests the different pieces of the page. It requests the main page from example.com, your browser sends any example.com cookies with the request.
Your browser needs the javascript, it sends the cookie in the request to get the JavaScript file. It needs the like button, it sends a request off to Facebook.com and sends the Facebook.com cookies with it.
Note that the request to example.com doesn’t send the cookies for Facebook.com, and the request to Facebook.com doesn’t send the cookie for example.com to Facebook. However, it does tell Facebook.com that the request for the like button came from example.com.
Facebook puts an identifier in the cookie, and any request to Facebook sends that cookie and the site it was loaded on.
So you log in to Facebook, it puts an identifier in your cookies. Now whenever you go to other sites with a Facebook like button (or the Facebook analytics stuff), Facebook links that with your profile.
Not logged in? Facebook sets an identifier to track you anyway, and links it up when you make an account or log in.
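To make that concrete, here’s a rough sketch of what such a button script can do once it’s loaded into the page (the domains and the /collect endpoint are made up, not Facebook’s actual code):

```javascript
// Sketch: a page on example.com embeds a script from tracker.example.
// The request for that script already carried tracker.example's cookies
// (including any identifier set earlier) plus a Referer header naming
// example.com. Once it runs, the script can also report the page explicitly:
fetch("https://tracker.example/collect", {
  method: "POST",
  // Attach tracker.example's cookies to this cross-site request
  // (possible when the cookie was set with SameSite=None; Secure).
  credentials: "include",
  body: JSON.stringify({
    page: location.href,          // the embedding page, e.g. https://example.com/post/42
    referrer: document.referrer,  // how the visitor arrived there
  }),
});
```

The key part is that the browser attaches the tracker’s own identifier cookie to that request automatically, simply because the request goes to the tracker’s domain.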
How is Facebook able to know what site is requesting it? Is it in the referer header, or is it parameters in the javascript/image url?
There is a referer header sent, but depending on the exact code added to the page, it’s very likely they are loading a snippet of JavaScript that lets them collect other information and trigger their own sending of information to their server.
For example, Google Analytics has javascript added to the page, but loading fonts from Google’s CDN (which many sites do) will rely on the referer.
Thank you for the explanation!
> Put a Facebook like button on your website, it’s loaded directly from Facebook servers. Now they can put a cookie on your computer with an identifier.
Which is not allowed by GDPR, btw, because they do that even if you don’t click them. There are plenty of guides online on how to create your own non-tracking Facebook like button.
How does GDPR fit in to Google Analytics and personalised ads?
I would have thought it went something like: random identifier: not linked to personal info, just a collection of browsing history for an unidentified person, not under GDPR as not personal info.
Link to account: let them request deletion (or more specifically, delinking the info from your account is what Facebook lets you do), GDPR compliant.
Both Google and Facebook run analytics software that tracks users. I presume letting people request deletion once it’s personally linked to them is probably what lets them do it? But I don’t live in a GDPR country, so I don’t know a whole lot about it.
No, it should’ve been opt-in. But there’s the “vital interest” loophole, and politics is slow and surface-level as always.
Yup. Nobody else gets those cookies.
This is old news, from 2022!!
From the blog post:
“June 14, 2022”
“Updated Aug. 28, 2024”
“And starting in 2024, all our users can look forward to Firefox blocking even more third party cookies.”
Except it’s still out of date, because it mentions Chrome also blocking third-party cookies, when at this point they’ve announced that they’ve abandoned that course of action.
Chrome, I’m looking at you. When are you getting it?
Google recently cancelled their 3rd-party cookie plan because they realized it’s not gonna work for their data harvesting goals
Never, because Chrome is a data harvesting platform.
Made by an advertising company
Get fucked, advertisers.
There are still plenty of fish for advertisers, sadly.
Advertisers track you with device fingerprinting and behaviour profiling now. Firefox doesn’t do much to obscure the more advanced methods of tracking.
EU outlaws it
The EU isn’t the only place on the planet, even if its laws have an impact.
Don’t all the advanced ways rely on JavaScript?
Lots do. But do you know anyone that turns JS off anymore? Platforms don’t care if they miss the odd user for this - because almost no one will be missed.
I go hard with DNS-based ad blocking and I’m constantly confirming it works by checking the network tab in developer tools. I’m basically only seeing first party scripts and CDN assets — 99% of websites don’t host tracking garbage themselves.
Pihole?
It’s a common solution; I do something more involved and manual, but it’s the same concept.
Is it something you can talk about? I’m currently in the process of trying to switch from pihole to pfblockerng but am interested if there are better alternatives
I use LibreJS with few exceptions. If I need to use a site that requires non-free JavaScript, I’ll use a private browsing window or (preferably) Tor Browser.
“Anymore”? I’ve never met a single soul who knows this is even possible. I myself don’t even know how to do it if I wanted to.
I do use NoScript, which does this on a site-by-site basis, but even that is considered extremely niche. I’ve never met another NoScripter in the wild.
Am I in the wild? I use it.
They probably mean in the flesh.
I don’t really talk about it in meat space, so they just might not have known.
The people who I’ve tried to get on NoScript seem to have the brain capacity of goldfish. If the site doesn’t instantly work, it’s as if the sky has fallen and there is no way to convince them to pay attention to which scripts are actually needed.
It’s a rare breed that is willing to put up with toggling different scripts on and off. I’ll also acknowledge that too many people (including me) are in a giant rush. For work-type stuff, I have the laptop without noscript, because sometimes I do need something to work absolutely right now.
You don’t think you are being a tad judgemental?
People whose lives revolve around fashion probably think you dress like shit.
People who love food probably think you eat like shit.
People who love cars probably think you are a shit driver.
You probably love computers and care about privacy, and you are shitting on regular users (assumption, admittedly) for not being invested.
They had something that was working, you present noscript, thing no longer works. If you are not invested, how are you going to see the appeal of extra work?
Well, you know what they say. You can lead a horse to water, but you can’t make it interested in learning about the water cycle to have a deeper understanding of why the river flows in the first place.
Why not just use ublock medium mode?
Roughly similar to using Adblock Plus with many filter lists + NoScript with 1st-party scripts/frames automatically trusted. Unlike NoScript however, you can easily point-and-click to block/allow scripts on a per-site basis.
https://github.com/gorhill/uBlock/wiki/Blocking-mode:-medium-mode
uBlock origin + NoScript for me. I deal with the bigger umbrella of scripts with uBlock and then fine tune permissions to the ones that uBlock allowed with NoScript.
They might be fingerprinting me using these two extensions though.
Not all but most, yes. But TBF, sites that still function with JS disabled tend to have the least intrusive telemetry, and might pre-date big data altogether.
Regardless, unless the extent of a page’s analytics is a “you are the #th visitor” counter, all countermeasures must remain active.
Honestly, it would be hard to do. There are perfectly legitimate and everyday uses for pretty much everything used in fingerprinting. Taking them away or obscuring them in one way or another would break so much.
Librewolf has Resist Fingerprinting, which gets you pretty far.
Every Librewolf browser uses the same windows user agent, etc. But there are downsides, like time zones don’t work, and sites don’t use dark mode by default.
And even then, EFF’s Cover Your Tracks site can still uniquely identify me, mainly through window size. That’s one of the reasons why Tor Browser uses letterboxing to make the window size consistent.
I don’t know what letterboxing is. But if window size is used to identify me, can’t it be circumvented simply by using the window in restored size, and not maximised?
Your restored window size is even more unique than your maximised window size!
The correct solution is to just not make the window size available to JS or to remotes at all. There’s no reason to ever need specifics on window size other than CSS media-queries, and those can be done via profiles.
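For a sense of how little it takes, here’s a minimal sketch of the window/screen signals any page script can read today, with no permission prompt (standard DOM properties, nothing site-specific):

```javascript
// Standard properties any page script can read; combined, they narrow you down a lot.
const sizeSignal = [
  window.innerWidth,        // width of the content area (your "restored" window size)
  window.innerHeight,
  window.screen.width,      // full screen resolution
  window.screen.height,
  window.devicePixelRatio,  // zoom / HiDPI scaling factor
].join("x");
console.log("window/screen component of a fingerprint:", sizeSignal);
```

Letterboxing counters this by rounding the reported window size to a small set of common buckets, so many users end up reporting the same value.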
But the restored size keeps changing - can’t be profiled, right?
And how do I not make the size available “to JS or to remote”?
Librewolf supports letterboxing as well, though the setting might be disabled by default
It’s really strange how they specifically mention HTML5 canvas when you can run any fingerprinter test on the internet and see that Firefox does nothing to obfuscate that. You can run a test in Incognito mode, start a new session on a VPN, run another test, and on Firefox your fingerprint will be identical.
Well yeah, they’re just blocking known fingerprinting services. If you use a tool that they don’t recognize, it’ll still work, but their approach will still block the big companies that can do the most harm with that data.
The only alternative is probably to disable WebGL entirely, which isn’t a reasonable thing to do by default.
> WebGL
I wish Firefox had a per-site or per-domain preference for WebGL (as well as for wasm, etc), the same way we have per-site cookies or notifs preferences. It’d help clear most issues regarding this.
Maybe they should try to develop the uBlock Origin extension together with its dev, to make it last longer.
Article from JUNE 14, 2022
For those who don’t care to read the full article:
This basically just confines any cookies generated on a page, to just that page.
So, instead of a cookie from, say, Facebook, being stored on site A, then requested for tracking purposes on site B, each individual site would be sent its own separate Facebook cookie, that only gets used on that site, preventing it from tracking you anywhere outside of the specific site you got it from in the first place.
I don’t know why this wasn’t the case long ago.
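A conceptual sketch of the change, purely as an illustration (this is not Firefox’s actual implementation): the cookie jar is keyed by the cookie’s domain and the top-level site you were visiting, instead of by the cookie’s domain alone.

```javascript
// Conceptual illustration only: the jar is keyed by BOTH the embedded cookie's
// domain AND the top-level site being visited, so "the facebook.com cookie"
// on siteA is not the same cookie as on siteB.
const partitionedJar = new Map();

const jarKey = (topLevelSite, cookieDomain) => `${topLevelSite}|${cookieDomain}`;

function setCookie(topLevelSite, cookieDomain, name, value) {
  const key = jarKey(topLevelSite, cookieDomain);
  if (!partitionedJar.has(key)) partitionedJar.set(key, new Map());
  partitionedJar.get(key).set(name, value);
}

function getCookie(topLevelSite, cookieDomain, name) {
  return partitionedJar.get(jarKey(topLevelSite, cookieDomain))?.get(name);
}

// The tracker sets an ID while you browse siteA.com ...
setCookie("siteA.com", "facebook.com", "uid", "abc123");
// ... but on siteB.com the "same" facebook.com cookie simply isn't there:
console.log(getCookie("siteA.com", "facebook.com", "uid")); // "abc123"
console.log(getCookie("siteB.com", "facebook.com", "uid")); // undefined
```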
It increases the implementation complexity of the browser and loses the $$$ of the people who fund Firefox and contribute code
Basically creates a fake VM-like environment for each site.
Disabling cross-site cookies has already been a thing for decades…
Same with Do Not Track requests.
Do Not Track has never really done anything, it just asks websites politely to not track you. There’s no legal or technical limitation here.
I’d still much rather have it than not. It also led to the spiritual successor GPC, which does actually have regulatory requirements under the CCPA.
Fair. However, it also provides websites with additional information to fingerprint you, so that’s a thing too.
Disabling cross site cookies and allowing them to exist while siloed within the specific sites that need them are two different things.
Previous methods of disabling cross site cookies would often break functionality, or prevent a site from using their own analytics software that they contracted out from a third party.
Thank you for your explanation, that greatly clears up my confusion.
TBH, if a person’s concern is being tracked by, for example, Facebook, then this just lets Facebook continue tracking them without directly allowing Facebook’s analytics customers to track them to another site (though indirectly that information can still be provided). But I guess for all the people giving FB and Google those privileges, better to have this than not.
Isn’t this basically Firefox’s version of the third party cookie block that Chrome rolled out a few months ago? Or am I missing something here?
I mean, it’s good news either way but I just want to know if this is somehow different or better.
Sites are much more contained now. It’s much more like a profile per site.
> For those who don’t care to read the full article
Or even the whole title, really
Hahahahaha so it doesn’t break anything that still relies on cookies, but neuters the ability to share them.
That’s awesome
I would love to see an icon of a neutered cookie please 🥺😄.
Honestly, I thought that’s how it already worked.
Edit: I think what I’m remembering is that you can define the cookies by site/domain, and restrict to just those. And normally would, for security reasons.
But some asshole sites like Facebook are making them world-readable for tracking, and this breaks that.
Total Cookie Protection was already a feature (introduced on Feb 23rd, 2021), but it was only for people using Firefox’s Enhanced Tracking Protection (ETP) in strict mode.
They had a less powerful third-party cookie blocking feature for users that didn’t have ETP on strict mode, that blocked third party cookies on specific block lists. (i.e. known tracking companies)
This just expanded that original functionality, by making it happen on any domain, and have it be the default for all users, rather than an opt-in feature of Enhanced Tracking Protection.
That’s not what I was thinking of, which was even more fundamental. But that’s good info (and another way to cover stuff in the article).
Edit: what I was thinking originally was really stupid, that 3rd-party cookies weren’t allowed at all. Which was really dumb since of course they are.
No, you weren’t far off. A single site can only get and set cookies on its domain. For example, joesblog.com can’t read your Facebook session cookie, because that would mean they could just steal your session and impersonate you.
But third-party cookies are when joesblog.com has a Facebook like button on each post. Those resources are hosted by Facebook, and when your browser makes that request, it sends your Facebook cookies to Facebook. But this also lets Facebook know which page you’re visiting when you make that request, which is why people are upset.
With this third-party cookie blocking, when you visit joesblog.com and it tries to load the Facebook like button, either the request or just the request’s cookies will be blocked.
Although that raises an interesting question. Facebook is at facebook.com, but its resources are all hosted under fbcdn.com. Have they just already built their site to handle this? Maybe they just don’t strictly need your facebook.com cookies to load scripts, images, etc. from fbcdn.com.
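A quick, purely illustrative way to see the first part of that:

```javascript
// Running in a page on joesblog.com (hypothetical): document.cookie only ever
// exposes cookies scoped to the site you're on, never facebook.com's.
console.log(document.cookie);  // e.g. "theme=dark; joesblog_session=..."
// There's no API here to read another domain's cookies; the browser only
// attaches facebook.com's cookies to requests it makes *to* facebook.com.
```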
They’ve been doing this with container tabs, so this must be the successor to that idea (I’m going to assume they’ll still have container tabs).
Container tabs are still a thing in FF. This is based on that work, if I remember correctly.
I love container tabs. It’s one of the reasons I went back to FF.
Same, they’re an absolute game changer for me. I have to use multiple different identities at work due to separate Active Directories, and container tabs make it super easy
Container tabs are still useful, as they let you use multiple cookie jars for the same site. So, it is very easy to have multiple accounts on a site.
From my experience, blocking 3rd party cookies in general doesn’t seem to make any difference for site functionality anyways. Though I never log into sites with a Google or FB account other than Google or FB sites (and rarely at all for the latter).
Unless that cookie was somehow important for you to use both sites, but that’s incredibly rare.
Forgive me if this is an overly simplistic view but if the ads with cookies are all served on Google’s platform say then would all those ads have access to the Google cookie jar?
If they don’t now then you can bet they are working on just that.
They are usually separate things. Cookies are produced/saved locally, to be read on the next visit (by the same website or many websites, basically forever unless you use Firefox containers or at least clear them once in a while). There’s also local storage, which is different but can also be used to identify you across the web. Ads, trackers, all of these categories are often made of many small components: you read a single article on a “modern” newspaper website, hundreds of connections are made, and different tiny scripts or icons or images are downloaded (usually from different subdomains for different purposes, but there’s no hard rule). It’s possible to block one thing and not another. For example I can block Google Analytics (googletagmanager), which is a tracker, but accept all of Google’s cookies.
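On the local storage point, a minimal sketch of how a script can use it as a persistent identifier (the key and endpoint names here are made up):

```javascript
// Minimal sketch: local storage as a persistent identifier (names are made up).
let visitorId = localStorage.getItem("visitor_id");
if (!visitorId) {
  visitorId = crypto.randomUUID();              // random ID on first visit
  localStorage.setItem("visitor_id", visitorId);
}
// Unlike a cookie, this is NOT sent automatically with every request; a script
// has to ship it explicitly, e.g. to some analytics endpoint:
navigator.sendBeacon("/analytics/collect", JSON.stringify({ visitorId }));
```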
The way I’m reading it, they allow the third party cookies to be used within the actual site you’re on for analytics, but prevent them from being accessed by that third party on other sites.
But I just looked at the linked article’s explanation, and not a technical deep dive.
So that’s what third party cookies are. What this does is make it so that when you go to example.com and you get a Google cookie, that cookie is only associated with example.com, and your random.org Google cookie will be specific to that site.
A site will be able to use Google to track how you use their site, which is a fine and valid thing, but they or Google don’t get to see how you use a different site. (Google doesn’t actually share specifics, but they can see stuff like “behavior on one site led to sale on the other”)
We’ll have to see what happens but what you are talking about is what Mozilla calls Third-Party Cookies and… they are aware of it.
I can’t entirely tell if that means they will be put in the facebook cookie jar or if it will be put in the TentaclePorn Dot Org (don’t go there, it is probably a real site and probably horrifying) cookie jar. If the former? Then only facebook themselves have that which… is still a lot better I guess? If the latter then that is basically exactly what we all want but a lot of sites are gonna break (par for the course with Firefox but…).
> TentaclePorn Dot Org (don’t go there, it is probably a real site and probably horrifying)
It’s registered through namecheap and points to cloudflare, but there’s nothing behind cloudflare. It just times out. That was disappointing.
The cookie would go to the Facebook or tentacleporn cookie jar depending on which site the user has actually visited. Whatever the domain in the address bar says.
InB4 the guy who replies to defend tentacle porn…
Alright fine, I’ll switch browsers AGAIN
Let me guess, it’ll still let websites see a list of connected microphones and cameras with zero user interaction?
Trying:

navigator.mediaDevices.enumerateDevices()
  .then(function (devices) {
    devices.forEach(function (device) {
      // Log each device's kind, label and id (no camera/mic permission requested).
      console.log(device.kind + ": " + device.label + " id = " + device.deviceId);
    });
  });
it appears to have no label and the ids are randomly generated per site.
So it still shows the number of devices then?
I’m curious how this will affect OAuth (if at all). Does it use an offsite cookie to remember the session, or is that only created after it redirects back to the site that initiated the login?
In my experience it generally breaks it. Leveraging cookies on the auth domain is fine, but once you are redirected to another domain, that application needs to take the access and refresh tokens and manage reauthentication as a background process. Simply don’t store those things as cookies, though.
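Roughly the pattern I mean, sketched with made-up endpoint and field names (not any particular provider’s API):

```javascript
// Sketch: keep tokens in app memory and refresh in the background,
// instead of relying on cross-site cookies to the auth domain.
let accessToken = null;
let refreshToken = null; // obtained once, right after the OAuth redirect

async function refreshAccessToken() {
  const resp = await fetch("https://auth.example/oauth/token", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ grant_type: "refresh_token", refresh_token: refreshToken }),
  });
  const data = await resp.json();
  accessToken = data.access_token;
  refreshToken = data.refresh_token ?? refreshToken;
}

// Refresh periodically (or on a 401 response) as a background process, so
// nothing depends on the auth domain's cookies being sent from another site.
setInterval(refreshAccessToken, 5 * 60 * 1000);
```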
I was also wondering that
Good to see Firefox still has value to provide
Firefox is awesome.
Does this stop me from adding to my website an iframe to facebook where facebook can keep its cookies for my user? That would be great but I doubt it.
I haven’t worked with HTML since 1999; I hate that I’m just now finding out that iframes are somehow still a thing in the modern world. What the actual fuck. Why? Don’t we have some fancy HTML5 or Ajax or something that can replace them?
Yeah, I don’t know why; probably exactly because it is such a neglected feature that it offers workarounds for some limitations, like in the case of cookie-related patterns.
IIRC an iframe’s contents are treated as a separate window, so cookies aren’t shared either
That’s horrific WHY?
> do not add any event listeners for `message` events

This is a completely foolproof way to avoid security problems. 🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡🤡
Sure, but the separate window can be on a different domain. Now you have a way to share cookies across multiple websites on different domains if all of them include an iframe to this external domain. And you can use in-browser messages (see window.postMessage()) to communicate between iframes and main window.
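A minimal sketch of that messaging, with hypothetical domains:

```javascript
// In the embedding page on https://sitea.example, which contains
// <iframe id="tracker" src="https://tracker.example/frame.html">:
const frame = document.getElementById("tracker");
frame.addEventListener("load", () => {
  // Only a window whose origin matches the second argument receives this.
  frame.contentWindow.postMessage({ page: location.href }, "https://tracker.example");
});

// Inside the iframe served from tracker.example:
const knownEmbedders = ["https://sitea.example", "https://siteb.example"];
window.addEventListener("message", (event) => {
  // Always verify the sender; a real tracker would check event.origin
  // against its list of customer sites.
  if (!knownEmbedders.includes(event.origin)) return;
  console.log("embedding page reported:", event.data.page);
});
```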
Indeed see sibling comment https://programming.dev/comment/11983146