Readit News
subsystem commented on Disqus cracked – Security flaw reveals users’ e-mail addresses   cornucopia-en.cornubot.se... · Posted by u/SuperChihuahua
draugadrotten · 12 years ago
If a political organisation was revealing the identities behind anonymous speech on a jewish forum, the world would be up in arms. If the identities on a gay board was published, Obama himself would be apologising. Now the identities of thousands of people commenting on politics in Sweden was revealed, and it's OK because "they" are the bad guys, says the extreme left organisation Researchgruppen.

The slippery slope is

subsystem · 12 years ago
Press freedom generally trumps privacy, as it should.
subsystem commented on Doom’s Creator Looks Back on 20 Years of Demonic Mayhem   wired.com/gamelife/2013/1... · Posted by u/adventured
rpm4321 · 12 years ago
This was interesting:

"The worst aspect of the continuing pace of game development that we fell into was the longer and longer times between releases. If I could go back in time and change one thing along the trajectory of id Software, it would be, do more things more often. And that was id’s mantra for so long: 'It’ll be done when it’s done.' And I recant from that. I no longer think that is the appropriate way to build games. I mean, time matters, and as years go by—if it’s done when it’s done and you’re talking a month or two, fine. But if it’s a year or two, you need to be making a different game."

subsystem · 12 years ago
subsystem commented on Disqus cracked – Security flaw reveals users’ e-mail addresses   cornucopia-en.cornubot.se... · Posted by u/SuperChihuahua
dutchbrit · 12 years ago
Old, old news: almost every WordPress blog uses Gravatar, same issue...
subsystem · 12 years ago
So why do you feel compelled to post when you know what you're saying is old news? You just add to the noise and make it off-putting for anyone who actually knows about this event (including your obvious point) to post.
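For context on the Gravatar issue mentioned above: Gravatar-style avatars expose an MD5 hash of the user's email address, and since MD5 is fast to compute, a leaked hash can be matched against a list of candidate addresses. A minimal Python sketch of that dictionary attack (the function names and candidate list here are illustrative, not taken from either service):

```python
import hashlib

def gravatar_hash(email: str) -> str:
    # Gravatar-style hash: trim and lowercase the address, then MD5 hex digest
    return hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()

def recover_email(leaked_hash, candidates):
    # Dictionary attack: hash each candidate and compare to the leaked hash
    for email in candidates:
        if gravatar_hash(email) == leaked_hash:
            return email
    return None

# What a page exposes is only the hash, but it is enough:
leaked = gravatar_hash("Alice@example.com")
print(recover_email(leaked, ["bob@example.com", "alice@example.com"]))
# prints "alice@example.com"
```

This is why publishing an email hash is effectively publishing the email for any address that appears in a dictionary or a previous breach.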
subsystem commented on Airbnb says this man does not exist. So I had coffee with him   pando.com/2013/12/08/airb... · Posted by u/antr
gamblor956 · 12 years ago
Somehow, every other VRBO site (such as VRBO.com) manages to comply with local laws. AirBnB doesn't--by choice. It refuses to comply with local laws because this lets it avoid the costs of compliance. Ultimately, AirBnB's competitive edge over its competitors is simply regulatory arbitrage.

This is why AirBnB generates such antipathy. Take the regulatory arbitrage away and AirBnB isn't a technical startup or a market disruptor; it's just another VRBO site with pretty CSS.

> They are overkill and would be the same as if Congress, at the federal level, had passed laws saying that YouTube-style services should be banned because they can facilitate copyright infringement that hurts others.

No, completely different, and as a lawyer you know this. Local issues are valid concerns for local laws. If New York wants to pass an across-the-board tax on temporary rentals, that is absolutely not the same as Congress passing a country-wide ban on YouTube-style services.

> The key to all this is to deal with the abuses while preserving the values conferred by the new services. If there is antipathy toward the wrongdoers, there is no basis for directing this to the innovators themselves.

Existing laws already do this. And as a business that injects itself into the market governed by such laws, AirBnB has taken on the burden of complying with such laws. Moreover, AirBnB isn't an innovator--it's a copycat. The only innovation AirBnB provided was sub-unit rentals (i.e., just a room or a couch), which is no longer the mainstay of its business.

subsystem · 12 years ago
From what I've seen I agree with your first point. AirBnB is more like The Pirate Bay than YouTube.
subsystem commented on PISA : Diligent Asia, indolent West    economist.com/blogs/graph... · Posted by u/ekm2
yapcguy · 12 years ago
The second link [2] is very interesting - it breaks down some past USA results by ethnicity.

It also seems the underlying PISA data is a closely guarded secret. No doubt a political tinderbox.

> Why does my second graph have to compare reading scores from 2009 to science scores from 2006 and math scores from 2003?

subsystem · 12 years ago
You can't really compare different data sets like that.
subsystem commented on Testimony of Ms. Soon Ok Lee (2002)   judiciary.senate.gov/hear... · Posted by u/mckee1
fennecfoxen · 12 years ago
Also, unlike North Korean camps, they generally don't send young children to Guantanamo Bay because of their parents' political crimes (real or invented).
subsystem commented on As engineers, we must consider the ethical implications of our work   theguardian.com/commentis... · Posted by u/jvoorhis
grellas · 12 years ago
Evil behavior, however precisely defined, has been and always will be with us. Technology enhances what we do as human beings and, hence, always has the potential to be applied to ill uses. If someone, then, takes what you develop and applies it for a purpose you never intended in creating it, that is an item beyond your control. Alan Turing - who applied his genius to confer what can only be called immeasurable benefits on society and who used his skills to crack Nazi codes to help end a terrible war - is not ethically responsible for the many consequences inevitably brought into the world simply because computing power can be used to magnify the effects of human evil. Was he (or is any other engineer whose technology is misused) a causal agent in the various bad outcomes we can identify in, for example, the enhanced lethality of weaponry or in the massive spying by governments on their citizenry? In a narrow sense, perhaps yes. When one traces things back up a causal chain, one can theoretically identify every individual actor who made technical innovations that culminated ultimately in a particular bad use of whatever type that afflicts us today. But, though a cause-in-fact, Mr. Turing (and the many engineers who followed him respecting any given facet of computing technology) is not what the lawyers call the "proximate cause" - that is, the immediately enabling agent - of the outcome. Meaning, if you deem it unethical to build bombs, then don't do work for a defense contractor helping to build bombs because your every innovation will be immediately applied to a use you deem unethical. The same for working for NSA in developing sophisticated spying technology. Or for whatever other ill use you can identify in society. 
But, beyond avoiding direct conduct by which you are proximately helping to cause an outcome you deem wrong, you as a technologist basically have no control over how your work may be applied by others and, as the collective results of such work eventually permeate society, your moral responsibility for the indirect results of your work effectively stands at zero. If the operative standard were otherwise, then all innovation would stand frozen altogether because it is always possible to conceive of an ill use for any technology that makes things faster, more powerful, more efficient, etc. Put any such thing into the hands of human actors and some bad results are guaranteed to follow given enough time and opportunity. Thus, unless one is to freeze all productive activity or is to go insane second-guessing how others might pervert that which is being done for good, engineers must perforce ignore tangential ethical implications over which they have no effective control.

I think it is fair to say that each of us in our given professions (mine being law) ought to avoid being a proximate cause of something deemed wrong even though technically legal (for example, I would not be a "mob lawyer" even though there are some technically very good lawyers who do that work). But even here that is an individual choice for each actor to make. For engineers, some may see it as a great opportunity to do advanced work in some of the social media companies while others may regard such companies as being engaged in unethical conduct as they at least sometimes use dubious techniques to try to corral us as consumers into their tight little worlds. For any given engineer, working for such a company doing such things is a matter of conscience. Some may say yes, others no. The same is true in working for a defense contractor or for the government. Or for any other work that is legal but ethically suspect in the eyes of some but not others. It is your choice and it is your conscience.

The author of this piece reflects what I call the bane of associational thinking. He uses the royal "we" to define a group (here, engineers) and then prescribes very broad goals for what "we" "ought" to do. Since all the things described now stand as a matter of private choice over which the "we" group as a whole has no say, then the only way to translate this sort of thing into practical action is to form formal associations, assign things to committees, and then issue a series of prescriptions on what the group members ought to do. That may be fine in terms of the association giving rules that amount to exhortations to do good (who would disagree with that?). But, beyond that, do you really want an organized association dictating what today stand as your private choices for your career? Or, worse, do you want such an association to lobby governments to adopt their strictures and give them the force of law? I would think not. How, then, can "we" do better? Of course, the question is always described as difficult and is left for further discussion precisely because it has no real answer apart from acting on individual conscience or apart from the potentially coercive alternatives of letting some association or government dictate your career choices and opportunities. Perhaps this sort of reasoning is justified as encouraging people to have a heightened conscience about what they do, and in that respect it is fine. But that is really as far as it goes before veering into unacceptable alternatives.

Our capacity to do wrong is innate to our nature, as is our capacity to do good. We should not stop trying to do good through our creativity just because others can take what we do and commit wrongs with it. Nor should we feel guilty about what we do as long as we in good conscience can say to ourselves that we are doing something productive and worthwhile and not directly causing harm to others. The "we" issue is in reality much more of an "I" issue and, for that, you should examine what you do carefully and strive for the good regardless of what others may do with it. If you want to exhort others to do better by your standards, then all the better. Just don't dictate to them on matters over which people in good conscience may disagree.

subsystem · 12 years ago
> He uses the royal "we" to define a group (here, engineers) and then prescribes very broad goals for what "we" "ought" to do.

That is pretty much the definition of ethics [0].

http://www.scu.edu/ethics/practicing/decision/whatisethics.h...

subsystem commented on Cyberlibertarians’ Digital Deletion of the Left   jacobinmag.com/2013/12/cy... · Posted by u/jboynyc
rayiner · 12 years ago
The author does not call people leftists or rightists, but rather associates language and causes with the left or right. He points out that cyber-libertarianism tends to use the terminology of the left ("freedom", "open") while embracing the values of the right. Not everyone on every issue, but as a pattern of behavior across many issues. The article points out that because of this language, people who otherwise identify with leftist causes end up supporting cyber-libertarian policies that further rightist causes.

I think the best thing in the article is its treatment of the different definitions of "freedom." Cyberlibertarians use "freedom" to mean "freedom from." This is quite incompatible with what many leftists [1] might mean by "freedom," specifically the freedom of the masses to act collectively to shape their society.

[1] And indeed, many who would probably consider themselves "conservatives" in a pre-Tea Party world.

subsystem · 12 years ago
The concept of freedom can quickly become confusing, but the accepted terminology is that negative freedom is "freedom from" (freedom from interference, as in protections like absolute[-1] freedom of speech), while positive freedom is "freedom to" (the capacity to act, as in guarantees like universal healthcare) [0]. Negative freedom is associated more with the libertarian right, and positive freedom more with the left. You of course also have to account for the political y-axis (or a similar concept [1]), i.e. the level of authoritarianism.

I think it's interesting that many of the concepts the US prides itself on, like the American Dream and the Four Freedoms, originally gave positive freedom a more prominent place than you see today [2][3][4].

[-1] Non-absolute freedom of speech can probably also be seen as a negative freedom.

[0] http://plato.stanford.edu/entries/liberty-positive-negative/...

[1] http://en.wikipedia.org/wiki/Political_spectrum#Other_multi-...

[2] http://en.wikipedia.org/wiki/James_Truslow_Adams#American_Dr...

[3] http://www.theguardian.com/politics/2001/jun/29/comment

[4] http://en.wikipedia.org/wiki/Four_Freedoms

subsystem commented on Google Puts Money on Robots, Using the Man Behind Android   nytimes.com/2013/12/04/te... · Posted by u/ipince
wpietri · 12 years ago
Just guessing, but for me it's about the audience. Software in the 70s and 80s was very nerdy. With the rise of the web, it got less so, driven by the need to reach consumers and the influence of print design. In the last decade, a lot of tech is downright chic, and succeeds because of that. Look at the iPod and the iPhone, for example.

I think robotics is coming up on a similar transition. For years it was 99% research projects and industrial uses: pure nerdery. But Bot & Dolly is selling to Hollywood, and is very slickly marketed. Aesthetics are starting to really matter.

subsystem · 12 years ago
I don't know. In terms of excitement and creativity I think we're slowly catching up to the end of the 90s.

u/subsystem

Karma: 3631 · Cake day: August 2, 2012