I can't wait to see more affordable RISC-V microcontrollers on the market!
The Kendryte K210 looked very cool, especially with its SIMD-ish "machine learning coprocessor", but it felt like they had rushed the hardware to market without investing in scrutable documentation or software support, last time I checked.
The GD32V series looks fantastic, since the current crop of GD32VF103 chips appears to be API-compatible with the venerable STM32F103 workhorse. But I haven't been able to find a source for the raw chips yet; it seems like you can only get them on development boards at the moment.
And there are always softcores running on FPGAs, but those sort of highlight how many permutations of "RISC-V" exist. I hope that we don't end up with too many inscrutable compiler flags to juggle as more of these chips become available.
Not completely API-compatible. They are /extremely/ similar. GigaDevice did make an ARM STM32 clone, so they probably just did a ctrl-c, ctrl-v on the peripherals. Addresses are slightly different, and things aren't quite the same. I just ran into an issue with the system timer and its lack of documentation... But the good news is that they are pin-for-pin compatible, and you can put a raw chip onto a blue-pill board and use it!
You can get raw chips from Taobao. I used Taobao and a reseller, Superbuy, to get mine. Not bad at all!
Huh, what do you search for on Taobao to find them? I tried "GD32VF103CB" a week or two ago, but I only got results for the GD32F103 ARM clones.
I guess it makes sense that the addresses aren't quite the same; iirc ST has some licensing restrictions on their SVD/header files saying that you can't use them with other vendors' chips anyways.
What's the appeal in using an STM32 clone? Are you building things in quantities where saving a few pennies is really worth the headache of all the subtle incompatibilities?
In lieu of the documentation, Kendryte has _a ton_ of code on their GitHub, so you could use that if you're looking to work on something practical. For me, having a dual-core RISC-V with _TPU_ and _DSP_ onboard for $8 in single-digit quantities is absolutely nuts. I frankly didn't believe it and thought the spec sheet was fake, but I've since bought some real dev boards with this chip, and they work. Even if you don't need the TPU bits, just having dual-core RISC-V is more than worth the price of admission.
I don't care about the TPU, don't need the DSP, but dual-core? Yes, please. Having a dedicated core for hard real-time and using the other for more complicated things is way better than a single core. I've seen so many projects attempting to cover up timing glitches from their single cores doing everything.
I don't understand why there aren't many multi-core offerings. It's not as if the silicon would cost a lot more, especially when it would be possible to downgrade the speed and/or size.
Seems like he has been changing companies quite a bit recently. Is this typical for the VP level? Does anyone know how his tenure at Google was viewed by others?
Well, his time at Tesla was basically (publicly, IIRC) determined to be a bad fit between him and the company, so you can't really fault him there. So since his long tenure at Apple, he had a misstep at Tesla and a shortish, but certainly reasonable, stint at Google. I would definitely not call that job hopping.
His credentials put him in high demand. Because of this, he has the luxury of changing jobs in order to find his fit. Jumping can dampen that demand at some point, but it hasn't so far.
I don't think it's the absolute length of time, but rather the fact that the projects he was spearheading (Swift for Tensorflow/MLIR) are nowhere near production-ready (or at least they weren't the last time I checked).
2.5 years seems a very short amount of time to develop, launch and push adoption for low-level PL tooling (in an ecosystem that's not mature, but at least is reasonably developed). I would expect 5 years at a minimum.
I interpreted him going to Google as a place to learn a few new things while figuring out what to do next. He’s done some big stuff in the past. But even people like him need time to figure out what to tackle next.
Anyone who has been following his projects knows that 2.5 years is too little time for the problems he set out to solve and the projects he was leading (esp. with TensorFlow, MLIR, the TF runtime, TPUs). So this is really a very early, abrupt and surprising exit, and several people both inside and outside know that he ran into political issues/struggles with high-level folks. Fortunately, one of the projects (MLIR) is already part of LLVM and will remain unaffected, but it looks like S4TF may just die out irrespective of what's being said. Otherwise, we may not hear much on the details, given his "seemingly" reasonable-length stint at Google.
He seems like the kind of developer that needs a real challenge to work on. When things shift into maintenance mode, he gets bored. That’s my take anyway. He’s a brilliant programmer. I’m glad to hear that he’s working on something really interesting.
I would avoid binning people into builders or maintainers; there are many other issues that could prompt one to leave Tesla or Google, e.g. work environment or ethical issues.
Swift for Tensorflow could never be taken seriously outside the Apple community.
On Linux, Foundation barely works and one still needs to selectively do either import Darwin or import Glibc for basic IO stuff.
Then we are already at Swift 5.1, and the Windows version has to be built from source with lots of caveats.
How can it even be taken seriously against Julia, Tensorflow for C++, ML.NET all of which work across macOS, Linux and Windows as of today, and offer the same strong typing benefits?
While I agree there is an even chance that Google will allow S4TF to fade away after Lattner's departure, that is more a reflection of the company having no commitment to, or consistency with, good ideas. Ex: https://gcemetery.co
However, S4TF should be taken seriously if you understand what they are trying to accomplish and how deeply they designed machine learning support into the language. Take a look at fast.ai's new course offerings using S4TF (http://fast.ai). Swift has always been a long bet. If it doesn't work the way you want yet, it is still short-sighted to discount it in the future.
The Swift core team just added two new members, whose contributions are related to Swift on Windows, and Swift on the server respectively, and have specifically called out supporting Swift in non-Apple domains as a part of the roadmap for Swift 6 (including adding the autodiff work from S4TF into mainline Swift). There is attention being paid to these issues.
I love the idea of Swift for Tensorflow : powerful automatic differentiation and a solid type system in a single language.
That's something that is not seen elsewhere and that would make it a perfect fit to write deep learning code targeting production systems.
Now the language needs two things in order to be safe from a hypothetical abandonment by Google:
- running smoothly on Linux (I thought it was already there, but your post seems to imply that it is not the case)
- getting the auto-diff out of the alpha stage, so that people can build frameworks on top of it (fast.ai seems ready to jump on that, which is nice)
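For anyone who hasn't seen what "automatic differentiation" actually buys you, here is a toy sketch in Python. Note the heavy caveat: this is forward-mode AD with dual numbers, purely to illustrate the general concept; S4TF's autodiff is reverse-mode and integrated into the compiler, so none of this code reflects its implementation.

```python
from dataclasses import dataclass

# Toy forward-mode automatic differentiation using dual numbers.
# Each Dual carries a value and its derivative with respect to the input;
# arithmetic on Duals propagates derivatives by the chain rule.

@dataclass
class Dual:
    val: float  # function value
    dot: float  # derivative w.r.t. the input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Differentiate f at x by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).dot

# d/dx (3x^2 + 2x) at x = 2 is 6x + 2 = 14
print(derivative(lambda x: 3 * x * x + 2 * x, 2.0))  # prints 14.0
```

The appeal of doing this at the language/compiler level (as S4TF and Julia's Zygote do) is that user code needs no special wrapper types at all.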
Sadly, I agree with you. Initially I was really excited about S4TF, because Swift is a fantastic language and it would be a fantastic replacement for Python as the de facto ML language. But then I realized TensorFlow is much worse than PyTorch, and the S4TF team was too small to build anything substantive enough to win market share.
S4TF was started over 2 years ago, and hasn't really gained any ground since then. It's a project on life support and Lattner's departure means the project is going to be put down soon.
It's good to try different things; S4TF tried, and failed. No one really cares about it at Google except the team developing it. The researchers are adopting other new technologies, like JAX.
Unfortunately Swift is a dud.... and the Apple ecosystem is stuck with it for the next few decades....
I have said, Swift is to Objective-C, what Scala is to Java. Sure, there are plenty of people that like Scala, but its 'academic stuffiness and complexity' doomed it to a niche language.
Same with Swift. It is doomed to be an apple ecosystem only type of language.
All I wanted was a Python* look-alike with some solid static typing; what we got was a franken/monster/language where people felt free to try out their little academic pet projects, sucking the fun out of programming and making it less accessible to beginners.
Go is becoming popular not because it is shoved down people's throats, but because of its own merit, and mainly because they kept it simple. It is the closest to a "static Python with some minimal features" that we have got...
*I think Python is a great language, and very accessible for beginners, just not suitable for large projects due to its dynamic type system
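To make the footnote concrete, here is a small, hypothetical Python sketch (the function names are made up for illustration) of the kind of bug dynamic typing defers to runtime in a large codebase, and how annotations let a static checker such as mypy catch it before execution:

```python
def total(prices):
    """Sum a sequence of prices. Nothing constrains what callers pass in."""
    return sum(prices)

# Works as intended:
print(total([1.5, 2.5]))  # prints 4.0

# But a caller can pass the wrong shape and the failure only appears
# at runtime, possibly deep inside some other module:
#   total(["1.5", "2.5"])  # TypeError: unsupported operand type(s)

# With annotations, the same mistake is flagged by `mypy` before the
# program ever runs, while behavior at runtime is unchanged:
def total_typed(prices: list[float]) -> float:
    return sum(prices)
```

This is the "solid static typing" trade-off the comment is asking for: Python only gets it via an optional, external checker rather than the compiler.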
I write iOS apps for a living. Most of the people I know have moved away from ObjectiveC. When I see language boards and jobs for iOS, I see Swift asked for over ObjectiveC. On Reddit, Swift has more subscribers than ObjectiveC. Most of the tech articles I see these days are still ObjectiveC, but when I watch a video from WWDC I often see Swift first.
Swift may not be what you wanted, but it is a long way from being a dud. Swift didn't have to be great, it just had to be better than ObjectiveC.
I'm not sure this argument makes sense. I'm not primarily a Swift developer. My go-to languages are Python, C++, Java and Javascript (I build DevOps automation pipelines and monitoring tools, primarily for scientific computing). But my experience with Swift on macOS and iOS has been pleasant. I definitely would not use it for cross-platform development, but that is not because the language isn't nice; it's because it won't have widespread support outside the macOS ecosystem. For UI apps on macOS and iOS, it is the best choice. Similarly, I use C# for Windows UI apps, but would not use it for anything else. Use the right tool for the job.
Because of some technical argument, or just personal aesthetics?
>I have said, Swift is to Objective-C, what Scala is to Java. Sure, there are plenty of people that like Scala, but its 'academic stuffiness and complexity' doomed it to a niche language.
While Objective-C was really cool, it was not modern enough, and few people liked it (mostly old NeXT/early OS X guys, but not most of the iOS crowd).
And Swift is easy to use and nothing like Scala in academic-ness and complexity.
>All I wanted is a Python* look alike, with some solid static typing
That's not what the ecosystem needed, or what people in general want (there's Go for that, for one). Swift is somewhere between Rust and Kotlin, features wise.
Have you checked Nim? It's the closest thing to static Python that I know of (plus some extras like Lisp-inspired macros), and there is some early development of a machine learning ecosystem (outside of wrappers):
https://nim-lang.org/
https://github.com/mratsim/Arraymancer
When I read that SiFive is working on custom silicon, my first thought was to wonder what would happen if custom hardware and custom programming languages co-evolved together, instead of languages adapting to old hardware that's adapted to older programming languages.
.. and here he is talking about compilers. I might have to keep an eye on SiFive in addition to Oxide.
There are some other people talking about him not staying long at places. In this talk [1] he mentions how he intended to stay at UIUC for one year and got 'nerd-sniped' into staying for 5 years building LLVM. After an experience like that, I could see how someone might feel claustrophobic and tend to take any opportunity on offer - if it's interesting enough.
[1] https://m.youtube.com/watch?v=yCd3CzGSte8
I wonder if this will curtail the effort to implement TensorFlow in Swift, turtles all the way down?
That would be a shame. The Python ecosystem with TensorFlow, PyTorch, mxnet, etc. has been good for rapid progress, but I think we need something better to break out beyond just deep learning. That needs a hackable infrastructure, and I personally don't have the skill to hack the C++ TensorFlow core.
I think a new ecosystem based on Swift, TensorFlow, and future tools and platforms makes some good sense.
An alternative would be a similar hackable infrastructure based around the Julia language, which is also very good.
Even in Swift it isn't "turtles all the way down".
The AD stuff is hardcoded into the C++ guts of the compiler, whereas Julia's source-to-source autodiff accesses a compiler pass from a pure-Julia user package.
Aside from making it easier to hack and improve the AD system as just a Julia user, this capability enables other program-transforming packages like https://github.com/MikeInnes/Poirot.jl for probabilistic programming.
So Julia is already further ahead in that regard and it's more hackable.
re: "So Julia is already further ahead in that regard and it's more hackable."
I agree. Flux is very concise, very nice to work with. I just had some trouble with my small playing-around code snippets when going from one minor release to the next, but that probably means I should revert to the LTS 1.* version.
I have tried Julia with non-mathematical stuff like using it with sqlite, fetching and using RDF data, and general text processing - nice for those use cases also.
He went from Tesla to Google in no time flat, less than 3 months. But he was at Google from August 2017 until now which is significantly more than one ISO no time flat unit.
Thanks for the extra information!
How so?
Edit: Jesus it was a legitimate question, why the downvotes?
I wouldn't read anything else into it.
I'm assuming they'll just die on the vine now?
also: https://twitter.com/JokerEph/status/1221831507351748608
That would be a weird move if they were looking to outright drop it.
[0]https://twitter.com/DaveAbrahams/status/1207690883782467584 [1]https://en.wikipedia.org/wiki/David_Abrahams_(computer_progr...
(via https://news.ycombinator.com/item?id=22160226)