disconnected commented on Facebook content policies tweaked over time to accommodate POTUS   washingtonpost.com/techno... · Posted by u/thelock85
justapassenger · 6 years ago
The free press is surprisingly supportive of the idea of private corporations policing and setting the standards for content and political discussion.

I hate how online discussions about politics look, but I worry that, instead of pushing for accountability of public figures through democratic tools and institutions, articles like this one focus on trying to get a private police force up and running.

Facebook, Twitter, Google, etc censoring more and more content from public figures won’t change the fact that the USA has a racist president and that tons of people support him.

And while I think it’s a valid question how much racist and discriminatory content public figures should be allowed to put out before they’re thrown out of office, it shouldn’t be decided by Silicon Valley.

disconnected · 6 years ago
The "free press" doesn't have the "get out of jail free" card that Facebook does: the ability to claim they are just a "platform" and dodge all responsibility for what they print.

If the press writes something objectively false about someone, they are staring at a defamation lawsuit. If they write something that gets someone killed, they will be staring at possible criminal liability. If they print, I dunno, a picture of a child having sex, there'll be hellfire and brimstone (both legal and social).

And no, saying "ha ha, it was just an opinion column" doesn't save them.

"Press" has perks but also has responsibilities. The free press is "ok" with these rules because they've had to follow them for decades. What they want is a level playing field.

disconnected commented on How to Write a Video Player in Less Than 1000 Lines (2015)   dranger.com/ffmpeg/ffmpeg... · Posted by u/selvan
umvi · 6 years ago
"How to write a json parser in 2 lines of Python"

    import json
    json.loads('{"That": "was easy"}')
(To be fair, knowing ffmpeg exists doesn't mean I'd be able to easily write a video player with it without a lot of research. For that reason I find this tutorial is still quite interesting and valuable.)

disconnected · 6 years ago
> (To be fair, knowing ffmpeg exists doesn't mean I'd be able to easily write a video player with it without a lot of research. For that reason I find this tutorial is still quite interesting and valuable.)

The tutorial is still 1000 lines of C, involving SDL, ffmpeg and threads.

Yikes :)

Last time I did something like this (a UI that, among other things, played a live stream from a cheapo IP camera), I used Python and the libvlc Python bindings. Creating a bare-bones media player with it was trivial, and it worked very well for what I needed. The only complaint I had at the time was that the documentation was absolutely terrible. A cursory look today reveals that they seem to have improved it, so yay?

And yes, the end result in my case came out MUCH smaller than 1000 lines (but lines of code is a shitty metric anyway).

If you "just" need a media player, IMHO, libvlc is not an awful option (in Python, at least - I have no experience with other bindings): https://wiki.videolan.org/LibVLC/
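For reference, the "trivial" part is not an exaggeration. A minimal sketch using the libvlc Python bindings (the `vlc` module from the `python-vlc` package; the RTSP URL and the `play_stream` helper are made up for illustration) looks something like this:

```python
# Bare-bones media playback via the libvlc Python bindings.
# Requires "pip install python-vlc" plus the VLC libraries themselves.
import time

try:
    import vlc
except ImportError:
    vlc = None  # bindings not installed; the sketch below then does nothing

CAMERA_URL = "rtsp://192.168.1.64/stream1"  # hypothetical IP camera address

def play_stream(url, seconds=10):
    """Play a media URL for a fixed amount of time, then stop."""
    if vlc is None:
        return False
    player = vlc.MediaPlayer(url)  # one call gets you a working player
    player.play()                  # decoding runs on libvlc's own threads
    time.sleep(seconds)
    player.stop()
    return True
```

Embedding the output into an existing window (for a real UI) takes a couple more calls, but the core player really is this small.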

disconnected commented on The Rust Compilation Model Calamity   pingcap.com/blog/rust-com... · Posted by u/WTTT
adamnemecek · 6 years ago
You have a monorepo with 2 million lines of code? How long is the compilation supposed to take?
disconnected · 6 years ago
I don't know how long it is "supposed" to take, but you can compile a Linux kernel (27.8M lines of code, though a good chunk of that won't be compiled anyway because you don't need it, and another good chunk is architecture-specific) in under 10 minutes on relatively modest (but modern) hardware.

On the other hand, something like Chromium (25M lines of code) will take about 8 hours, and bring your machine to its knees as it consumes ALL available resources (granted, last I did this I only had 8GB of RAM, and I was running my desktop at the time... including Chromium). I don't remember exactly how long Firefox takes to build, but I remember it was significantly less time (maybe 3 hours?).

So... it depends? On a lot of things?

(BTW, the LoC numbers were pulled from the first legitimate-looking result I could find in a quick search - take them with a grain of salt. The compilation times are a rough approximation based on my observations - take those with a truckload of salt.)

disconnected commented on Facebook PHP Source Code from August 2007   gist.github.com/nikcub/38... · Posted by u/patrickdevivo
shawnz · 6 years ago
Are you a lawyer?
disconnected · 6 years ago
You don't have to be a lawyer to understand that subtraction and multiplication are two completely different operations.
disconnected commented on Every Google result now looks like an ad   twitter.com/craigmod/stat... · Posted by u/cmod
kllrnohj · 6 years ago
> This is part of Google's attempt to de-prioritise the URL.

URLs have always been an implementation detail and not a user feature. From the very beginning, it was intended that users would follow links, not type in URLs. HTML was built on hiding URLs behind text. Then AOL keywords happened. Then the search explosion happened. And short URLs. And QR codes for real-world linking. And bookmarks, because, yet again, typing in URLs is not a major driving use case.

Typing in un-obfuscated URLs has almost never been a key feature or use-case of the web. If anything URL obfuscation is a core building block of the web and is a huge reason _why_ the web skyrocketed in popularity & usage. Don't pretend that somehow AMP obfuscating URLs will be the death of the web. The web exploded in growth despite massive, wide-spread URL obfuscation over the last 20 years. Nothing is actually changing here.

disconnected · 6 years ago
The "web" is built around "human readable" technologies. Even actual implementation details that the user doesn't care about - like the application layer protocol (HTTP) and the source code for pages (HTML, CSS) - are human readable.

The "point" of the web was to serve humans, not machines. If we wanted to serve machines, we'd just throw binary blobs around, which would be orders of magnitude more efficient.
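To make that concrete: an entire valid HTTP/1.1 request is just a few lines of plain text you could type by hand (a minimal sketch; `example.com` is a stand-in host):

```python
# HTTP is human-readable text on the wire: a complete request is just
# a request line, a few "Name: value" headers, and a blank line.
def build_request(host, path="/"):
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"  # blank line terminates the header block
    )

print(build_request("example.com"))
```

Pipe that string into a raw TCP socket on port 80 and any web server will answer, which is exactly the kind of transparency a binary protocol wouldn't give you.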

That said, I still have a bunch of "ancient" tech magazines that had directories of URLs for (then) popular websites, grouped by category. That's how we found things then.

People forget that there was a world before Google.

disconnected commented on Software certifications; a waste of time and money (2018)   tomaytotomato.com/certs-w... · Posted by u/mothsonasloth
disconnected · 6 years ago
Software certification matters in certain industries.

In automotive and aerospace, you will find that being certified opens a bunch of doors for you.

From a company's perspective, these certifications don't PROVE that you know what you are doing, but at least they prove that you aren't completely in the dark with regard to all the annoying processes that exist in the industry (stuff like MISRA C, Automotive SPICE and so on). Or, at the very least, that you aren't lying when you say you've TOTALLY heard of them, honest (un-shockingly, people lie on their resumes, even about things that can be easily checked).

This might not get you hired, but it will make them consider you.

disconnected commented on Mozilla lays off 70   techcrunch.com/2020/01/15... · Posted by u/ameshkov
throwaway123x2 · 6 years ago
It's crazy how much FF's marketshare has dropped. It's such a great browser.
disconnected · 6 years ago
The decline of Firefox is largely due to Mozilla's incompetence.

If you look at their own statistics [1], you will notice that (worldwide) usage started declining after the introduction of Quantum (November 2017), and dropped substantially after the April 2019 addons outage.

This would suggest, IMHO, that Firefox loses market share when beloved features - customization, addons - stop working or get worse.

This is unsurprising to anyone who has been using Firefox for a long time: the primary differentiating factor of Firefox has always been customization. Messing with that was always going to be a risky proposition. Nuking the whole ecosystem and starting over in the space of a few months was simply idiotic. Disabling everyone's addons because of an admin screw-up was just the icing on the fucking cake.

[1] https://data.firefox.com/dashboard/user-activity

disconnected commented on Tesla's stock just hit a record $420   cnn.com/2019/12/23/invest... · Posted by u/berbec
martythemaniak · 6 years ago
The stock is blazing!

Weed jokes aside, the market has finally caught up to the fact that Tesla is years ahead of their competitors. I bought my Model 3 18 months ago and it felt like buying a car from the future. I think my feeling was correct.

2019 saw a number of serious competitors (Audi, MB, VW, Porsche) release their first EVs and they've all been somewhat disappointing. Very solid first releases taken in isolation, but the market expectations were that the "Big Boys" were coming in and were going to wipe the floor with Tesla. Turns out they couldn't; Tesla's head start is real and consequential.

If you go out and actually try to put down $40-50k on a car and judge all EVs on a collection of attributes (range, efficiency, safety, charging options, tech, etc) Tesla clearly stands out. It's tough to convince yourself to put down a nearly-equivalent amount of hard-earned money for another brand, so it is no coincidence that they outsell other EVs by a very wide margin.

disconnected · 6 years ago
> the market has finally caught up to the fact that Tesla is years ahead of their competitors.

Incorrect. The stock value is being pushed up by:

1. Unexpected quarterly profits in October;

2. Tesla planning to open a new factory in China and securing a $1.4 billion loan to do so;

3. Apparent strong interest in the Cybertruck and the impending launch of the Model Y;

4. The USA and China easing the trade war, which is pushing the whole stock market up (there have been "records" left and right in the past few days).

But the big one is number 2. Tesla moving into China means that they can tap into the humongous Chinese market.

disconnected commented on GitHub Actions is my new favorite free programming tool [video]   bytesized.xyz/github-acti... · Posted by u/kmf
penagwin · 6 years ago
Thanks for the warning, are there any good tools that integrate with pull request checks that can be self hosted?

We self host nearly everything at work except for github. (And it's VERY difficult for me to get anything that costs money approved regardless of price)

disconnected · 6 years ago
> Thanks for the warning, are there any good tools that integrate with pull request checks that can be self hosted?

Recently at work we did an analysis of a number of CI/CD tools. Many of them are self-hosted, free (and open source), and support a variety of workflows - including pull request related checks (either by polling or via webhooks). A cursory search will yield you a lot to play with, so... go ahead and do that.
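To illustrate the webhook side of a PR check, here is a minimal sketch of the decision a CI receiver has to make. The header and field names ("X-GitHub-Event", "action") follow GitHub's pull_request webhook format; the helper itself is hypothetical:

```python
# Decide whether an incoming GitHub webhook delivery should trigger a CI
# build. A real receiver would sit behind an HTTP endpoint and also verify
# the X-Hub-Signature-256 header before trusting the payload.
import json

BUILDABLE_ACTIONS = {"opened", "synchronize", "reopened"}

def should_trigger_build(headers, body):
    """Return True if this delivery is a pull request event worth building."""
    if headers.get("X-GitHub-Event") != "pull_request":
        return False  # pushes, issue comments, etc. are ignored here
    event = json.loads(body)
    return event.get("action") in BUILDABLE_ACTIONS
```

Polling achieves the same thing by periodically listing open PRs and comparing head SHAs; webhooks just save you the polling interval.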

That said, if you don't want to think about it too much, you can't go wrong with Jenkins.

Some people will suggest GitLab. I'd steer clear of GitLab, though, because of their operational incompetence [1] and their funny ideas about mandatory corporate espionage [2] - I mean, telemetry - which they only backed away from because people yelled at them. That second one was enough to disqualify them in our analysis (the first one just cemented the idea).

[1] https://about.gitlab.com/blog/2017/02/10/postmortem-of-datab...

[2] https://www.theregister.co.uk/2019/10/30/gitlab_backtracks_o...

disconnected commented on Httpserver.h: Single header library for writing non-blocking HTTP servers in C   github.com/jeremycw/https... · Posted by u/jeremycw
celticmusic · 6 years ago
dpkg -i deb_file_I_downloaded_off_the_interwebz

phew, all that vendor lock-in that includes tools to install whatever I want!

disconnected · 6 years ago
sigh

That random file you downloaded off the internet was built under a specific set of assumptions - assumptions that only hold true if you are running the specific OS version they were targeting.

IF you download the .deb file for your specific OS, and IF you manually install all missing dependencies, then it works. Otherwise, you are still screwed.

At least you can extract it (it's actually an ar archive with a couple of tarballs inside). But that's no different from going to SourceForge or GitHub or whatever and getting the source tarball... and we are back where we started.
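The "you can extract it" part is genuinely simple, because a .deb is an ar archive: a magic string followed by fixed 60-byte member headers. A sketch of listing the member names in pure Python (the field offsets come from the classic ar format; the helper name is made up):

```python
# List member names of an "ar" archive (the container format of .deb files).
# Each member header is 60 bytes: name[16] mtime[12] uid[6] gid[6] mode[8]
# size[10] and a 2-byte terminator.
AR_MAGIC = b"!<arch>\n"

def list_ar_members(data):
    """Return the member file names of an ar archive given as bytes."""
    if not data.startswith(AR_MAGIC):
        raise ValueError("not an ar archive")
    names, offset = [], len(AR_MAGIC)
    while offset + 60 <= len(data):
        header = data[offset:offset + 60]
        name = header[0:16].decode("ascii").rstrip(" /")  # GNU ar pads with "/"
        size = int(header[48:58].decode("ascii").strip())
        names.append(name)
        offset += 60 + size + (size % 2)  # member data is 2-byte aligned
    return names
```

On a real .deb, this yields `debian-binary`, `control.tar.*` and `data.tar.*` - the last one being the tarball you'd actually unpack to get at the files.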

By the way, I was not complaining about "vendor lock-in". I was complaining about Debian's package management policies and how they can affect your software development process in practice - to make the case that apt is a crap replacement for a proper language/library/development-oriented package manager.
