BTW the reason dl.google.com rewrite was faster was not because it was in Go, it was because the C++ server was serving off its local disk and the rewrite was serving off a cluster file system with ~infinite I/O capabilities. Apples and oranges.
Simply put, this floors me. I looked at their costs, and with some developer muscle you could find savings such as:
- Move Fastly to Cloudflare. They don't expressly say what the cost is for that. But moving to CF would eliminate it.
- Move Heroku to Digital Ocean. It's not difficult to create a fully redundant solution.
- Move from Imgix to Golang microservices that handle the image resizing, and use something like BelugaCDN for the CDN. Beluga is $5k a month for a PB. (Or some other cheaper CDN if you don't like Beluga, but damn... imgix pricing)
I'm pretty sure that a savvy CTO could save at least $50k a month with a well designed project that does this over many months and achieves the same result and keeps the redundancy concerns for the small team.
I do realise, however, why the team have done this. In the same position (with very few resources) I would probably have done the same. But damn, when the cost of a service gets up to a year's salary ($120k) for a good developer, it's time to seek alternatives.
"With regards to the ads, nothing is changing in terms of ad privacy/tracking or our privacy policy in general. Ads should just become more relevant."
Have you noticed "improving.duckduckgo.com", the analytics service that logs all requests?
"To be clear, this means we cannot ever tell what individual people are doing since everyone is anonymous"
Oh, except the IP address, right. Fun fact (experiment now removed): https://web.archive.org/web/20180910042004im_/http://image.b...
Does Google have multiple "deletion policies", such that deleting data from, e.g., your GCP bucket follows one policy, while the "scrubbing" described in this article follows an entirely different one? If so, do different deletion policies have different processes and different audit trails, such that the end "deleted" state is subjective and controlled by the engineering and managerial oversight of that given product's team?
In my (naive) opinion, it must be really, really hard to, for example, retrain every ML model that a now-deleted datapoint ever touched. It's hard, too, to believe that, at some high level in Alphabet's org, there is no motivation to get the positive PR of features like this while still, in essence, not deleting the parts of the data trail that significantly drive Google's revenue. Do these datapoints significantly impact Google's revenue?
https://www.usenix.org/conference/srecon18asia/presentation/...
You can see there's a section on privacy and deleted data as well.
Each team has its own policies, because each product is different: at a bare minimum they might be using different storage systems, but it's very likely that their data pipelines are quite different, too. In any case, each team's targets are at least as strict as any published ones, of course.
Unless they're audited by a source that can be trusted and have the findings made public, I will not believe it either.
I bet it's not free to run, but it's cheaper and easier than elsewhere, because Google's infrastructure is built in-house and mostly integrated. I don't envy other companies that want to do the same.
Healthy Workers is an Amsterdam-based startup that measures things such as air quality, CO2, and consented employee data (e.g. their sleep and focus), and analyzes which parts of the building have a work environment that is not conducive to productivity and how it can be improved.
Conference rooms with bad air are the first problem they look at.
They are hiring for a head of sales and a product designer: https://healthyworkers.recruitee.com/
That's basically because Larry Page really, really cares about it. He's kinda like your friend with a Kubrick obsession who can't stop bringing up facts:
https://twitter.com/elonmusk/status/727189428142235648
He was on to something! Jokes aside, I think he just has a heightened sense of smell and that's why he had air filters stronger than law requirements installed everywhere, at least in Mountain View.