Are these institutions holding Tesla stock on behalf of mutual funds or ETFs that they then resell to the public? Or are these direct corporate investments? Or something else?
More generally, who votes the shares held by these institutions? Does it differ depending on the purpose for which the shares are held? If they're held as part of a fund, do the fund's owners get to vote them, or does the fund's management?
I can't even sign in with my own Notion account.
Link to audiobooks: https://marhamilresearch4.blob.core.windows.net/gutenberg-pu...
There's not a lot of writing about this. Folks seem content to fight Airflow's deficiencies. Most of them are too young to know any better. The critiques you'll find are generally written by competitors, or folks adopting a competitor.
Here's the big one I see lots of folks get wrong: do NOT run your code in Airflow's address space.
Airflow was copied from Facebook's Dataswarm and comes with a certain set of operational assumptions suitable for giant companies. These assumptions are, helpfully, not documented anywhere. In short, it is assumed that Airflow runs all the time and is ~never restarted. It is run by a team that is different from the team that uses it. That ops team should be well-staffed and infinitely resourced.
Your team is probably not like that.
So instead of deploying a big fleet of machines, you are probably going to do a simple-looking thing: make a Docker container, put Airflow in it, then add your code. This gives you a single-repo, single-artifact way of deploying your Airflow stuff. But since that's not how Airflow was designed to work, you have signed yourself up for a number of varieties of Pain.
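To make that concrete, here's a minimal sketch of the coupled setup, assuming a recent Airflow 2.x. The module name "my_project" and the task names are hypothetical stand-ins for your own code:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Your code, baked into the same image and imported straight into
    # Airflow's own interpreter. ("my_project" is a stand-in package.)
    from my_project.scoring import score_customers

    with DAG(dag_id="score_customers", start_date=datetime(2024, 1, 1), schedule=None) as dag:
        PythonOperator(
            task_id="score",
            # Runs inside the Airflow worker process: same Python version,
            # same site-packages, same everything as Airflow itself.
            python_callable=score_customers,
        )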
First, you are now very tightly coupled to Airflow's versioning choices. Whatever version of Python runs Airflow runs your code. Whatever versions of libraries Airflow uses, you must use. This is bad. At one point I supported a data science job that used a trained model serialized with joblib. That serialization was coupled to Python 3.6 and one precise version of scikit-learn. We wanted to upgrade Python! We couldn't! Don't use PythonOperator. You need separation between Airflow itself and Your Code. Use a virtualenv, use another container, use K8s if you must, but please, please do not run your own code INSIDE Airflow.
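For instance, here's a hedged sketch of the lowest-effort escape hatch, PythonVirtualenvOperator (assumes Airflow 2.x with the virtualenv package available; the pins, model path, and names are hypothetical). The callable must be self-contained, with its imports inside the function, because it executes in a freshly built venv rather than in Airflow's interpreter. DockerOperator or KubernetesPodOperator buy you even stronger isolation along the same lines:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonVirtualenvOperator


    def score_customers():
        # Imports live INSIDE the callable: it runs in a fresh virtualenv,
        # so Airflow's own site-packages are invisible to it.
        import joblib

        model = joblib.load("/models/churn.joblib")  # hypothetical path
        ...


    with DAG(dag_id="score_customers_isolated", start_date=datetime(2024, 1, 1), schedule=None) as dag:
        PythonVirtualenvOperator(
            task_id="score",
            python_callable=score_customers,
            # Pinned for the model, independently of whatever Airflow needs.
            requirements=["scikit-learn==0.24.2", "joblib==1.0.1"],
            system_site_packages=False,
        )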
Second, you cannot deploy without killing jobs. Airflow's intended "deployment" mechanism is "you ship DAG code into the DAG folder via mumble mumble, figure it out for yourself". The docs are silent. It is NOT intended that you ship by balling up this mega-container, terminating the running Airflow, and starting up the new image. You can do this, to be sure. But anything running will be killed. This will be fine right up until it isn't. Or maybe it won't be; maybe for you it'll be fine forever. But please, please realize that as far as Airflow's authors are concerned, even though they never said so, you are Doing It Wrong.