Readit News
genop commented on uLisp – Lisp for the Arduino   ulisp.com/... · Posted by u/weeber
lisper · 9 years ago
I have TinyScheme running on an STM32F415.

https://sc4.us/hsm/

genop · 9 years ago
Any BSD kernel drivers for this?
genop commented on SSRN sold to Elsevier   professorbainbridge.com/p... · Posted by u/kristianc
genop · 9 years ago
Do authors assign copyright to SSRN? (If SSRN is not trying to turn a profit why would they need an assignment?)

Will Elsevier require copyright assignment?

genop commented on UC students suit claims Google scanned accounts without permission   santacruzsentinel.com/gen... · Posted by u/Jerry2
genop · 9 years ago
In the previous Gmail litigation a couple of years ago Google Apps for Education users were one of the classes.

In that case Koh dismissed the 1967 California Invasion of Privacy Act eavesdropping claims because the users' lawyers could not show the emails were confidential (see Section 632). However, she did not dismiss the wiretapping/interception claims (see Section 631).

Another element required for an eavesdropping claim is lack of consent.

How are you going to show that the granting of consent was in question for each and every user, and was thus a "common question"? Maybe some users read the terms and others did not. Maybe some users paid attention to news reports about email scanning and others did not. See Koh's Opinion.

For those interested search "paulhastings.com" "in re google". There is a copy of the unreported Opinion, with some interesting redactions on the devices Google uses to scan email. Content Onebox, Medley Server, Changeling, CAT2 Mixer, ICEbox Server, etc.

What if users started putting "CONFIDENTIAL", as between sender and recipient, at the top of all their emails? From there one could argue confidentiality implies lack of consent to eavesdropping.

Apparently Google argued that users understand interception to be part of how all email, not just Gmail, is transmitted. Relaying through intermediary SMTP servers might count as "interception", but it seems a reasonable judge could conclude that if an intercepted message says "CONFIDENTIAL", there is no permission to read it, whether the reading is done by a human or by a machine programmed by a human.

genop commented on A fundamental introduction to x86 assembly programming   nayuki.io/page/a-fundamen... · Posted by u/nkurz
userbinator · 9 years ago
For "How does the computer really work?" questions, I recommend this book:

http://www.charlespetzold.com/code/

genop · 9 years ago
Funny you mention that one. I have been re-reading that book the past few weeks.

This book takes a bit of a different angle on explaining computers. I really enjoy the history.

My only complaint would be that I think he has a Microsoft bias (but then I guess I am biased myself).

And similarly, I have a minor annoyance with the OP's mention of Linux, and only Linux, when he touches on calling conventions. This bias toward one system, and ignorance of others, is typical of many websites and documentation. To be fair, the OP mostly avoids it.

To be clear, great explanations of computers to me are ones that either:

   1) take great care to stay completely neutral and only discuss universally shared traits across systems,

   2) go to great lengths to try to be as comprehensive as possible, including many systems and all their commonalities and idiosyncrasies, or

   3) focus only on one system and go into great detail about how it works.

The more the author strays from 1, 2 or 3, the less likely I am to read their work.

Petzold pays ample attention to Morse code and similar succinct ways of communicating information. In my opinion this type of focus is the mark of a skilled coder. When I look at the entries in the IOCCC, it is no surprise to me that Morse code is (or at least was) a frequent focus of the entrants.
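To see why Morse code makes such a compact exercise, here is a minimal encoder sketch in POSIX shell (the table is abridged to just the letters needed for the example):

```shell
#!/bin/sh
# Minimal Morse encoder sketch -- table abridged, uppercase input only.
morse_char() {
  case "$1" in
    A) echo ".-"  ;; E) echo "."   ;; O) echo "---" ;;
    S) echo "..." ;; T) echo "-"   ;; *) echo "?"   ;;
  esac
}

morse() {
  out=""
  word=$1
  while [ -n "$word" ]; do
    c=${word%"${word#?}"}      # first character of the word
    word=${word#?}             # remainder of the word
    out="$out$(morse_char "$c") "
  done
  printf '%s\n' "${out% }"     # trim trailing space
}

morse SOS    # prints "... --- ..."
```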

genop commented on Are US Courts Going Dark?   justsecurity.org/30920/co... · Posted by u/hackuser
genop · 9 years ago
Counterexample: the Volkswagen diesel case, before Judge Charles Breyer.

All litigation should follow this model. No delays, and transparent to the public. No PACER account needed.

genop commented on Programming by poking: why MIT stopped teaching SICP   posteriorscience.net/?p=2... · Posted by u/brunoc
lisper · 9 years ago
> even 4 whole years isn't enough to learn "all the fundamentals"

No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental." It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn.

genop · 9 years ago
Amen.

How could they ever understand "simple and easy"? Their concept of simple is not based in reality.

There seems to be this idea (I wish it had a name) that one can just apply a new abstraction layer and the lower layers magically disappear or are rendered insignificant.

And of course they produce hundreds or thousands of pages of "documentation". This is disrespectful of the reader's time. The best documentation is and will always be the source code. If I cannot read that, then for my purposes there is no documentation.

This is not to say that some of these old-new-thing, higher-abstraction solutions do not work well, are not easy to use, or do not look great. Many people love them, I'm sure. But until I see more common sense being applied, it is just not worth my time to learn them. I would rather focus on more basic skills that I know will always be useful.

genop commented on Programming by poking: why MIT stopped teaching SICP   posteriorscience.net/?p=2... · Posted by u/brunoc
jdmoreira · 9 years ago
Reading this made me so, so sad, even though I do agree with the reasoning.

I learned to program in a course that follows SICP. I spent all my college years learning how to program from first principles, building all the pieces from scratch: compilers, soft-thread implementations, graph-parsing algorithms... and I was happy with that way of programming!

Today I'm an iOS developer. I spend most of my day 'poking' at the Cocoa Touch framework, whose source I can't even read because it's closed. The startup I work for moves so fast that I'm forced to use other people's open-source projects without even having the time to read the source. I feel miserable doing the kind of programming I do nowadays! I wish I could go back to simpler times, when I actually had to reason about algorithms, data structures, hardware, etc...

genop · 9 years ago
Serious question: What is stopping you from going back?

I'm naive but curious and I do not want to make guesses.

I wonder how we are going to preserve knowledge about programming from first principles if, under pressure from corporations and lazy peers, no one does it anymore.

genop commented on Google Deactivates Web Search API   ajax.googleapis.com/ajax/... · Posted by u/woodruffw
genop · 9 years ago
The YouTube API used to be open too: no registration or authentication. That ended years ago. I don't really need an "API" to do searches and transform the results into a more useful format than HTML. These APIs make it easier, and are addictive, for sure. It's foolish to rely on them, however.
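A plain fetch-and-filter pipeline covers most of what such an API offers. A sketch in shell and sed, using invented markup (not any real search engine's actual HTML):

```shell
#!/bin/sh
# Hypothetical result markup -- invented for illustration only.
cat > results.html <<'EOF'
<a class="result" href="http://example.com/a">First hit</a>
<a class="result" href="http://example.com/b">Second hit</a>
EOF

# Transform each result line into "url | title".
sed -n 's/.*href="\([^"]*\)">\([^<]*\)<.*/\1 | \2/p' results.html
```

In practice the fragile part is the markup, which changes without notice; the transform itself stays a one-line filter.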
genop commented on To become a good C programmer (2011)   fabiensanglard.net/c/... · Posted by u/__john
umanwizard · 9 years ago
I see. So you are excluding the preprocessor, linker, type-checker (for some languages this is a separate program), assembler, etc.?
genop · 9 years ago
Yeah. I'm sorry about the terminology. I guess I just like the term "code generator".

When I use that term I envision simple filters that take ASCII input, maybe even some sort of "template", and transform it into some other format that's useful. Ideally, asm. But not always.

For example, in GCC, for x86, there are a couple of programs that operate on i386-opc.tbl and i386-reg.tbl. I would not call them "code generators" but I suspect they are needed in order for "gcc -S" to work.
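In that spirit, a table-driven "code generator" can be as small as a single sed filter. A sketch, using a made-up register table (this is not the real i386-reg.tbl syntax):

```shell
#!/bin/sh
# Made-up register table: name and number, one per line.
cat > regs.tbl <<'EOF'
eax 0
ecx 1
edx 2
EOF

# Generate C array initializers from the table.
sed -E 's/^([a-z]+) ([0-9]+)$/  { "\1", \2 },/' regs.tbl > regs.h
cat regs.h
# prints:
#   { "eax", 0 },
#   { "ecx", 1 },
#   { "edx", 2 },
```

The generated regs.h can then be #included inside a C array definition; the table stays the single source of truth.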

genop commented on Waybackpack: download the entire Wayback Machine archive for a given URL   github.com/jsvine/wayback... · Posted by u/ingve
genop · 9 years ago
Internet Archive has an HTTP header called "X-Archive-Wayback-Perf:"

I can guess what it means but maybe someone here has some insight?

It certainly looks like their Tengine (nginx) servers are configured to expect pipelined requests. They have no problem with more than 100 requests at a time. See the HTTP header above.

Downloading each snapshot one at a time, i.e., many connections, one after another, perhaps each leaving a socket in TIME_WAIT and consuming resources, may not be the most sensible or considerate approach. If just requesting the history of a single URL, maybe pipelined requests over a single connection are more efficient? I'm biased and I could be wrong.

However their robots.txt says "Please crawl our files." I would guess that crawlers use pipelining and minimize the number of open connections.
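A pipelined batch along those lines can be assembled in shell. A sketch with made-up snapshot timestamps (a real run would take the timestamp list from the archive's index):

```shell
#!/bin/sh
# Build a batch of pipelined HTTP/1.1 requests for a single connection.
# Timestamps below are placeholders, not real snapshots.
host=web.archive.org
for ts in 20160101 20160201 20160301; do
  printf 'GET /web/%s/http://example.com/ HTTP/1.1\r\nHost: %s\r\n\r\n' \
    "$ts" "$host"
done > batch.txt

# The whole batch would then be sent over one TLS connection, e.g.:
#   openssl s_client -quiet -connect "$host:443" < batch.txt

grep -c '^GET ' batch.txt   # prints 3: three requests in one payload
```

Three snapshots, one TCP connection, one TIME_WAIT when it closes, instead of three of each.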

I have had my own "wayback downloader" for a number of years, written in shell script, openssl and sed. It's fast.

IA is one of the best sites on the www. Have fun.
