nappy commented on Learning Tibetan changed the way I think (2023)   lionsroar.com/learning-ti... · Posted by u/whereistimbo
BobaFloutist · 9 months ago
It's also worth mentioning that although English doesn't explicitly require you to do any of this, we generally have ways of conveying respect/familiarity (by tweaking the formality of our register).
nappy · 9 months ago
English grammar also has features for expressing uncertainty, unreality, hypotheticals, wishes, demands with uncertain outcomes, etc. by use of the subjunctive mood.

"If I were a bird, I would be able to fly." (were, not was) "God bless you." (bless, not blesses) "The teacher demands that students be on time." (be, not are)

Though many native speakers, even very intelligent ones, fail to use the subjunctive mood properly at a high rate, or otherwise do not recognize it. As some of the other comments note, there are some interesting differences in what a language's grammar will strictly enforce; in English, proper use of the subjunctive mood is not strictly enforced. Obviously, there is also far less grammatical expressiveness in English around this than there is in other languages.

https://en.wikipedia.org/wiki/Subjunctive_mood#English
https://en.wikipedia.org/wiki/English_subjunctive
https://en.wikipedia.org/wiki/Grammatical_mood

nappy commented on Daylight Computer – New 60fps e-paper tablet   daylightcomputer.com/prod... · Posted by u/asadm
boochiboo12 · a year ago
Hi y'all, founder of Daylight here.

Happy to answer any questions you have. Long time lurker, so this is pretty cool to finally take part :)

I made this because I wanted the eye-strain-free and minimalist qualities of my Kindle/E Ink applied to so much more of what I do on a computer.

The lack of speed and the ghosting made it feel like traditional E Ink was impossible to use for most computing tasks. So we focused on making the most Paperlike epaper display, with no ghosting and a high refresh rate - 60 to 120fps. We started working on this in 2018.

We developed our own custom epaper display tech we call LivePaper. We focused on solving the tradeoffs RLCDs traditionally have - around reflectance %, metallic-look / not Paperlike enough, viewing angle, white state, rainbow mura, parallax, resolution, size, lack of quality backlight, etc.

First proof of concept in late 2021, and then it took us 2.5 years to get it into production.

And we built a whole android tablet around it.

It’s essentially our attempt at making a reMarkable tablet on steroids / Kindle on steroids. Definitely some trade-offs, but on the whole we think it’s worth it. (& on Twitter a bunch of early customers seem to think so too)

Note: it’s 60fps epaper, not off the shelf Eink. We spent years developing what we think is the best epaper display in the world and it’s exclusively manufactured by our display factory in Japan.

There are still many cases where traditional E Ink is going to be better (bistability, viewing angle, white state color, etc.), but we feel that for a more general-purpose computer you can code on, do Google Docs on, and do fast multitouch on, amongst a thousand other things, the speed and lack of ghosting totally make it worth it.

Think of it as a Godzilla-sized Pebble watch with a decade of improvement

Or think of it as a Game Boy Advance, advanced

nappy · a year ago
I'm glad to see so much interest in devices like this. I hope they succeed. I own an Onyx Boox Tab Ultra, which I've been enjoying tremendously, and its specs seem to compare favorably with this one's.

https://shop.boox.com/products/tab

Doesn't seem to be for sale at the moment, but the version with a color epaper layer is: https://www.amazon.com/BOOX-Tab-Ultra-Pro-Digital/dp/B0CHM54...

Do you have more details on your display technology and how it performs? Any demo videos? It sounds like this is the key selling point?

I'd also be curious to see more on the software side of this product - stock Android isn't perfect for an E Ink display.

nappy commented on A brief history of computers   lesswrong.com/posts/vfRpz... · Posted by u/zdw
dreamcompiler · 2 years ago
> [In the 1980s] Microprocessors started to replace integrated circuits.

Author implies (in the quote above which occurs after the discussion of the invention of personal computers) that the early personal computers from the 1970s did not use microprocessors. This of course is false: All the early "personal computers" used microprocessors. For example the IMSAI used an 8080, the Apple II used a 6502, and the TRS-80 used a Z80. Microprocessors -- which were never intended to be the basis for entire general-purpose computers -- were repurposed for exactly that application by visionaries like Woz. Microprocessors made personal computers possible.

It would be more correct to state that in the 1980s microprocessors began to replace integrated circuits in mini and larger computers.

A subtle related point is that it would be even better to point out that by "integrated circuit" above the author really means discrete small-scale integrated circuit. All microprocessors are integrated circuits, but not all integrated circuits are microprocessors. Microprocessors are large-scale integrated (LSI) circuits or nowadays very large-scale integrated (VLSI) circuits.

nappy · 2 years ago
There is a lot that is wrong in this article. Broad overviews are useful to people new to a topic, but this one would only mislead and confuse them.
nappy commented on A brief history of computers   lesswrong.com/posts/vfRpz... · Posted by u/zdw
kalverra · 2 years ago
Not specifically about computers, but if you want a very deep dive into the creation of the internet (including some bits about the earliest computers), The Dream Machine is a great and extensive look at the history of the internet through the lens of J.C.R. Licklider's life. It was rather mind-blowing to me in various ways, one of the big ones being that it seems a lot of early computer pioneers weren't only mathematicians and physicists, but also psychologists.
nappy · 2 years ago
Agreed. It's an excellent book. But perhaps a little long if you are purely interested in computer history and want an introduction in a shorter volume. I recommend these two:
https://en.wikipedia.org/wiki/The_Idea_Factory
https://www.amazon.com/Dealers-Lightning-Xerox-PARC-Computer...
nappy commented on A brief history of computers   lesswrong.com/posts/vfRpz... · Posted by u/zdw
citelao · 2 years ago
What would be a good, more academic overview of computing history? Do you have any specific book recommendations? I'd love to read a more "citation-based" version.
nappy · 2 years ago
Not sure about academic history, but in a single volume, this does a good job on early 20th century computer history: https://en.wikipedia.org/wiki/The_Idea_Factory
nappy commented on A brief history of computers   lesswrong.com/posts/vfRpz... · Posted by u/zdw
nappy · 2 years ago
I don't recommend reading this. There are many gaps and a lot of important history missing, including:

1. Computation before ~1800. The abacus, Napier's bones, slide rules, Pascal's calculator, motivations from celestial navigation and astronomy.

2. Modern analog computers, ~1900-1950. The author seems to refer to them as "math machines" and leaves it at that, without exploring what they were used for beyond calculating firing solutions for artillery. I think the author lacks a solid grasp of how mathematical tables were used from 1614 onwards, and of the fact that analog computers were used to create much more accurate and complex tables, which could be used for more accurate firing solutions, and for other purposes as well, beyond code-breaking.

>"It's hard for me to wrap my head around the fact that early, pre-general purpose computers (~1890-1950s) weren't computers in the way that we think about computers today. I prefer to think about them as "math machines"."

>"But subsequent machines were able to do math. From what I'm seeing, it sounds like a lot of it was military use. A ton of code-breaking efforts during World War II. Also a bunch of projectile calculations for artillery fire."

3. Poor description of the advent of electronic computers.

>"Then in the 1940s, there was a breakthrough.[10] The vacuum tube took computers from being mechanical to being electric. In doing so, they made computers cheaper, quieter, more reliable, more energy efficient, about 10-100x smaller, and about 100x faster. They enabled computations to be done in seconds rather than hours or days. It was big."

It was certainly a breakthrough, but the idea that computers immediately became quieter, cheaper, and more reliable is false. Initially they were much larger than the analog computers of the era. By almost any measure, they were also much less energy efficient, though this may depend on what sort of calculations you are doing - I'm less sure of this.

4. Incomplete and incorrect descriptions of programming languages and the history of digital logic. No mention of information theory, Claude Shannon, or digital circuits.

This is a poor analogy that misleads a reader who is unfamiliar with programming languages; it obscures the abstraction:

>"Think of it like this. It's translating between two languages. Assembly is one language and looks like this: LOAD R1, #10. Machine code is another language and looks like this: 10010010110101010011110101000010101000100101. Just like how English and Spanish are two different languages."

5. Lack of understanding of digital hardware.

The author never describes why or how vacuum tubes and then transistors allowed computers to use logic that is both digital and electronic.
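
For context on the gap being pointed out here: a vacuum tube or transistor acts as an electrically controlled switch, and switches compose into Boolean logic gates, from which any digital circuit can be built. A rough conceptual sketch in Python (gates modeled as functions, not an actual circuit simulation):

```python
# Conceptual sketch: a tube/transistor behaves as a switch controlled by its input.
# Two switches arranged appropriately give a NAND gate, and NAND alone is enough
# to build every other gate, and therefore any digital logic.

def nand(a: bool, b: bool) -> bool:
    # Output goes low only when both "switches" conduct.
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

# Truth-table check that these compositions behave as expected.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```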

The author jumbles a lot of ideas into one and does not seem to understand the relationship and distinction between the evolution of transistor technology (point-contact -> BJT -> FET -> MOSFET) and the creation of integrated circuits.

>"Before 1966, transistors were a thing, but they weren't the transistors that we imagine today. Today we think of transistors as tiny little things on computer chips that are so small you can't even see them. But before 1966, transistors were much larger. Macroscopic. Millimeters long. I don't really understand the scientific or engineering breakthroughs that allowed this to happen, but something called photolithography allowed them to actually manufacture the transistors directly on the computer chips."

6. Lack of historical context. No mention of the motivations for creating the vacuum tube or transistor: amplification and switching for use in telegraph and phone networks. No mention of the role the US government played beyond the 1890 Census, no mention of continued investments motivated by the Cold War, the Apollo Program, ICBMs, etc. They briefly cover artillery firing solutions and mention code-breaking.

7. Over-reliance on LLMs to research and write this.

Hard to take seriously a history which includes this:

>"And from what ChatGPT tells me, it's likely that this would have been an investment with a positive ROI. It'd make the construction of mathematical tables significantly faster and more reliable, and there was a big demand for such tables. It makes sense to me that it'd be a worthwhile investment. After all, they were already employing similar numbers of people to construct the tables by hand."

>"Anyway, all of this goodness lead to things really picking up pace. I'm allowed to quote Claude, right?"

nappy commented on Developer tools to create spatial experiences for Apple Vision Pro   apple.com/newsroom/2023/0... · Posted by u/todsacerdoti
leejoramo · 2 years ago
> a little bit of survivor ship bias at play

I am not sure why you are focusing on "survivor bias". My comment was about being a programmer and the fun of seeing a new technology that is an undefined territory to explore.

But I did directly mention that many of these did not work out, as I referenced: Amiga, OS/2, Palm, VisiCalc, and Lotus 1-2-3.

However, so that you may have some idea of my personal "survivor bias", here is a super-simplified, multi-dimensional history of my over 43 years of computing, smashed into a linear list with massive things forgotten:

  TRS-80 Model III, LDOS, BASIC, VisiCalc, 
  modem-to-modem
  TRS-80 Model 100
  CompuServe
  MS-DOS compatible, MultiPlan, TurboPascal, dBase
  Windows 3.0
  DesqView  
  Delphi Online 
  OS/2, REXX
  Macintosh System 7 --> 1993 to OS X
  Dial-up ISP
  Solaris, Perl
  BBEdit
  Windows 95
  Palm Pilot 
  Linux for Servers --> 1998 to Today
  Linux Desktop
  ColdFusion
  Apache 
  Python --> 1999 to Today
  Always on Internet (ISDN, DSL, Cable) --> 2004 to Today
  Nokia cellphones
  PHP
  Zope/Plone
  Mac OS X --> BETA to Today
  Windows XP
  Nginx
  iPhone --> 2009 to Today
  SublimeText
  Windows 10 --> 2018 to Today
  VS Code --> 2018 to Today
  NodeJS --> 2018 to Today
  SvelteKit --> 2021 to Today
  KDE Neon --> 2022 to Today (the year of Desktop Linux arrived for me)
  
The longest thread, from 1993 to today, is what is now known as macOS, but for about the first 15 years of my usage Apple was considered at risk of failure. MS Windows was always a part of my world, but seldom my primary focus; it has only been since 2018 that I have used it on an extended daily basis. Some of the above are clearly dead. Some I still use lightly (Nginx, BBEdit) but are no longer the focal point of my work.

As far as Apple's Vision Pro goes, I have no clue whether it will be successful. For me personally, it is the first headset I am even interested in playing with.

nappy · 2 years ago
Seems like the longest thread is Windows, by the same reasoning?
nappy commented on Apple Vision Pro: Apple’s first spatial computer   apple.com/newsroom/2023/0... · Posted by u/samwillis
nappy · 2 years ago
> "starting at $3,499"

I wonder what the model that you actually want to buy will cost and what the average sales price will be.

From the looks of it, I wouldn't be surprised if they sell a "pro" headband, like Meta does for the Quest, with a battery pack that does better than the 2 hours of charge from the brick.

nappy commented on Don Knuth plays with ChatGPT   cs.stanford.edu/~knuth/ch... · Posted by u/talonx
nappy · 2 years ago
Why does Knuth think Trump eats betel nuts? Does he?
