- As a formal test suite in the program's own language?
- Or using a .md natural language "tests" collection that must pass, which an LLM can understand?
To answer the OP, I learned to use different models for reasoning vs. coding.

Otherwise, the menu bar makes the contents not truly 16:9, or awkwardly shows distractions/leakage in the form of the menu bar.
Critically, if the content being presented is true pixel-perfect 16:9 and the window is not "fullscreen" or "maximized with hidden menu bar", the aspect ratio gets messed up: a slice of "missing" content along one edge ruins what is otherwise perfection.
Indeed, more for presenting than actual task work IME.
Nice thing is this clock can show on other displays, too, in multi monitor setups, e.g. when auto-hide-menu is enabled. So sweet.
The only problem with "automatically hide and show menu bar" was that the clock was not there.
Which is actually very nicely solved now.
Well done!
I would note there are some known health hazards (BPA/BPS)[1] in handling thermal-paper receipts with your bare hands if you do so often. I don't know much beyond this; I would look into it.
[1] https://www.pca.state.mn.us/business-with-us/bpa-and-bps-in-...
Secondly, IME thermal print can fade to nothing after 1-10 years. So these are specifically for short-ish-term use. Not for labeling something that is supposed to last a long time.
1) 3x5 cards printed on a printer dedicated to this task
2) Command line routine where I can:
   a) Enter tasks
   b) Update a card by entering the card number assigned to the task (which also includes a date)
   c) Reprint a card if needed
   d) View the card on the screen (obviously)
Written in bash.
To be clear, this is not for procrastination, but rather for keeping track of various things I want to get done the next day, or other info I want to be physically able to view on my physical desktop during the day.
(I hate to handwrite and can type very well so...)
The 3x5 card printout will contain a checkbox, so you can just ink-check any item.
The routine makes sure that you only type in the correct number of characters per line so it doesn't wrap.
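A minimal sketch of that line-length check in bash (the column width and function name are assumptions, not the commenter's actual script):

```shell
#!/usr/bin/env bash
# Reject task lines longer than the card's printable width so they never wrap.
MAX_COLS=40   # assumed printable width for a 3x5 card in the printer's font

fits_card() {
  # Returns 0 (success) if the line fits, 1 if it would wrap.
  (( ${#1} <= MAX_COLS ))
}

# Example interactive loop (commented out; for illustration only):
# while read -r -p "Task: " line; do
#   fits_card "$line" && break
#   echo "Too long (${#line}/$MAX_COLS chars), please shorten." >&2
# done
```

The same guard can be reused for every field the routine prints, keeping card layout fixed regardless of what gets typed in.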
I then modified this to be able to use larger index cards.
Index cards lay flat on the desk (as opposed to a receipt printer).
It's important to have a dedicated printer for this task; otherwise there's too much friction changing paper.
IME most printers struggle with printing thicker cardstock, or non-normal sizes, e.g. trouble with keeping that size paper straight or with bending/feeding.
Is it just a normal-sized printer, or are there special index-card printers?
Apple is ruthlessly focused on extracting maximum current and long-term value from its high-end operating system which appeals to creative professionals, software developers, rich people, regular home users, and both the tech savvy and not very tech savvy.
In order to support its operating system experience, Apple has to make hardware.
Yes, Apple has a quite large ecosystem, but they do things slowly, intentionally, and by the book.
You know what's really not cool that the other companies all looked the other way on and are too big to fail on? Massive, massive copyright infringement. I suspect that Apple's legal team made the ethical choice to not do that.
Apple knows that models and inference are a race to the bottom: the pirated content is broadly available, model building is pretty openly published, there are several players already, and the hardware is basically commodity.
Apple can still be the glue that offers that platform on which people develop and make use of AI related technology.
The OP's assertion seems to miss that, for local AI, if one wants to spend under $10K or under $20K on an inference setup, or even under $1000 for that matter, (arguably) by far the simplest and most efficient tokens per watt or tokens per dollar will come from buying Apple machines.
Apple appears to be simultaneously:
- avoiding truly silly levels of hype train buy-in
- letting market go where it wants to
- supporting that with AI where it makes sense for them (on-device for user privacy and easy to use where applicable; connecting users to interfaces to other companies where that makes sense)
To draw an analogy, Apple could make a Kagi/Google, or a GSheets/Excel, but they're not focused on those kinds of things. Engineering stuff is expensive. Apple apparently runs a very tight ship.

Honestly, I think AI has little place in an OS where the user is well acquainted with it and in full control. Many may disagree, and that's fine. But maybe Apple is willing to let those people use another operating system?
* Disclaimer, the above are just my thoughts, I do not have any special knowledge.
I somewhat regret my expensive switch from Linux to macOS. macOS is just so weird, it doesn't make any sense to me. For the first time in my life I feel like some tech-illiterate grandpa trying to figure out how to make his blasted computer do stuff.
When picking up macOS, two things really help:
1. Having some macOS techies in your circle (co-workers or friends) whom you can fearlessly ask random newbie questions. There's a good chance a way exists that works well which you just haven't discovered, and one or more people in your Mac-user friend group will have a good suggestion. (Maybe an LLM or Reddit can solve this, but real people are good, too.)
2. Leaning into whatever the macOS way to do the thing is. Don't try to do it the Windows or Linux way. Fall into the Apple paradigm. Don't fight it.