TTPrograms commented on Linear transformers are faster after all   manifestai.com/blogposts/... · Posted by u/JoelEinbinder
SmartestUnknown · 2 years ago
This is not a new algorithm. The same algorithm is described in Figure 4 (Theorem 3.1) of https://arxiv.org/pdf/2310.01655.pdf

(Disclaimer: I am an author on the linked paper)

TTPrograms · 2 years ago
I don't think the posted algorithm is particularly novel, but the algorithm you cite is deeply different.

Also, I note that the only thing you have posted before is a link to this same paper.

TTPrograms commented on Histogram vs. ECDF   brooker.co.za/blog/2022/0... · Posted by u/r4um
TTPrograms · 4 years ago
I think there's an issue with the histogram rendering in this post. The rapid descent from the spike on the left is not consistent with high ECDF impact and with the apparent binning resolution visible in the piecewise line segments. In general, histograms should not be visualized with connected line graphs in this way: the standard bar-graph depiction makes the bin width apparent and resolves some of the issues the article needs the ECDF for (e.g. relative impact can be assessed visually by comparing the relative areas of the associated bars). The bar visualization also makes it possible to use varying bin sizes, which is extremely useful with any distribution that has tails.
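
To make the bars-plus-variable-bins point concrete, here is a minimal matplotlib sketch (not from the original post; the data and bin edges are made up):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
latencies = rng.lognormal(mean=3.0, sigma=1.0, size=10_000)  # heavy-tailed toy data

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Bar histogram with varying bin widths: fine bins in the bulk, wide bins in the tail.
# With density=True the *area* of each bar equals the fraction of samples in the bin,
# so relative impact can still be compared visually despite unequal widths.
bins = np.concatenate([np.linspace(0, 100, 21), [200, 400, 800, 1600]])
ax1.hist(latencies, bins=bins, density=True, edgecolor="black")
ax1.set_title("Histogram (bars, variable bin widths)")

# ECDF: sorted values against cumulative fraction.
xs = np.sort(latencies)
ax2.step(xs, np.arange(1, len(xs) + 1) / len(xs), where="post")
ax2.set_xscale("log")
ax2.set_title("ECDF")

plt.tight_layout()
plt.show()
```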
TTPrograms commented on Airflow's Problem   stkbailey.substack.com/p/... · Posted by u/cloakedarbiter
mywittyname · 4 years ago
Managed Airflow doesn't even solve any of the author's outlined frustrations. It keeps the "obscene" syntax, it's still stateless, it's not "decentralized" etc.

Honestly, the article is so disingenuous that it comes off like a paid-for puff piece for Astronomer. It's the article-equivalent of the late-night infomercial guy who rips open a bag of potato chips like the hulk because he doesn't have this special tool that's just four easy payments of $9.99.

TTPrograms · 4 years ago
The new TaskFlow API has been part of Airflow 2.0 since its release in 2020: https://airflow.apache.org/docs/apache-airflow/stable/tutori...
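
For reference, a minimal TaskFlow-style DAG, as a sketch (assumes Airflow 2.x; the task names and values here are illustrative):

```python
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule_interval=None, start_date=datetime(2021, 1, 1), catchup=False)
def example_taskflow():
    @task
    def extract() -> dict:
        return {"value": 42}

    @task
    def load(payload: dict) -> None:
        print(payload["value"])

    # Passing the return value wires up both the dependency and the XCom hand-off.
    load(extract())

example_taskflow()
```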
TTPrograms commented on Show HN: Kestra - Open-Source Airflow Alternative   github.com/kestra-io/kest... · Posted by u/tchiotludo
tchiotludo · 4 years ago
I know that some issues are fixed in Airflow 2; they made a large improvement with that release. But not all issues are resolved by it.

The performance issue is still there: just launch Airflow and submit a thousand DAG runs with a simple Python sleep(1) and you will quickly hit the CPU bound, with a large total runtime. Airflow is not designed for a lot of short-duration tasks. With event-driven data flows, it's really complicated to manage.

Imagine a flow that is triggered for each store, for example (thousands of stores, with 10+ tasks for each one); Airflow will not be able to manage this kind of workload quickly (and that's not its goal). Airflow was clearly designed to handle small workloads (hundreds of tasks) running over a long time.

For the XCom part, Airflow stores it in the database, so you can only keep small data there (a database is not meant to store big files). In Kestra, we provide a storage layer that natively allows storing large data (GB, TB, ...) between tasks, without the pain on multi-node clusters.

TTPrograms · 4 years ago
Airflow 2 was released in 2020. You're saying you knew these issues were fixed, and yet an article was published on your webpage in 2022 knowingly comparing against the technical properties of a major release from two years earlier? That is not a good look.
TTPrograms commented on Show HN: Kestra - Open-Source Airflow Alternative   github.com/kestra-io/kest... · Posted by u/tchiotludo
tchiotludo · 4 years ago
Airflow has design and performance issues. If you want some details, you can find some of the reasons in this article: https://kestra.io/blogs/2022-02-22-leroy-merlin-usage-kestra....

Compared to other workflow engines (Dagster, Prefect, ...), we decided to take a completely different approach to building a pipeline. Where others use Python code, we went with a declarative language (like Terraform, for example). This has a lot of advantages for the developer experience: with Kestra, you can use the web UI directly to edit, create, and run your flows; there is no need to install anything on the user's desktop, and no need for a complex deployment pipeline in order to test on the final instance. Another advantage is that it allows you to use Terraform to deploy your flows. A typical development workflow is: in the development environment, use the UI; in production, deploy your resources with Terraform, both the flows and all the other cloud resources.

That said, it would be really nice to have some independent performance benchmarks. I really think Kestra is fast, since it is based on a queue system (Kafka) and not a database. Since workflows are only events (status changes, new tasks, ...) that need to be consumed by different services, a database doesn't seem like a good choice, and my benchmarks show that Kestra can handle a lot of concurrent tasks without using much CPU.

TTPrograms · 4 years ago
FYI some of the Airflow issues are out of date / can be resolved with config changes.

Airflow 2 is designed to support larger XCom messages (via pluggable XCom backends), so the guidance to only use it for small data no longer applies.
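
A rough sketch of that backend hook (the storage layer below is a hypothetical in-memory stand-in; a real backend would push payloads to S3/GCS/etc.):

```python
from airflow.models.xcom import BaseXCom

# Hypothetical stand-in for object storage; purely illustrative and
# process-local, so not usable across real workers.
_STORE: dict[str, object] = {}

class ObjectStoreXCom(BaseXCom):
    """Keep only a small reference in the metadata DB; park the payload elsewhere."""

    @staticmethod
    def serialize_value(value, **kwargs):
        key = f"xcom/{len(_STORE)}"
        _STORE[key] = value                   # "upload" the payload
        return BaseXCom.serialize_value(key)  # the DB row stores just the key

    @staticmethod
    def deserialize_value(result):
        key = BaseXCom.deserialize_value(result)
        return _STORE[key]                    # fetch the payload back by key
```

Enabling it is a config change: point the `xcom_backend` option in the `[core]` section of airflow.cfg at the class.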

Your DAG-construction overhead issue is likely due to DagBag refreshing: Airflow checks for DAG changes on a fixed interval, causing a reimport. The default period for that is fairly short, so for large deployments you will want to use a larger one (e.g. at least 5 minutes). I do not know why the default is so short (or was, last I checked, anyway). Python files shouldn't do much of note on import regardless, IMO.
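
Concretely, something like the following in airflow.cfg (option name and default are from memory for Airflow 2.x; check the docs for your version):

```ini
[scheduler]
# How often, in seconds, each DAG file is re-parsed. The default is short
# (30s), which means frequent reimports; raising it to 5 minutes or more
# helps large deployments.
min_file_process_interval = 300
```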

I am not otherwise familiar with the improvements in Airflow 2, so I cannot say for sure if your other complaints still remain.

TTPrograms commented on Lasers could cut lifespan of nuclear waste from a million years to 30 minutes   bigthink.com/the-present/... · Posted by u/metaph6
yummypaint · 4 years ago
I'm a practicing nuclear physicist and do work on gamma-ray interactions with actinides. It's possible I'm missing something important, but I think this article is an utter mess and was clearly written by someone with absolutely no understanding of what they are saying. Attempting to learn anything from it will be counterproductive.

Here is an explanation of chirped pulse amplification: https://www.rp-photonics.com/chirped_pulse_amplification.htm... This technique produces optical photons, which carry less than 10 eV of energy. In the same way that you can't focus the sun's light to a point hotter than the surface of the sun (it would violate the second law of thermodynamics), it isn't obvious how low-energy laser pulses can be useful for this. The article offers no explanation whatsoever. Maybe the electric field across the nucleus can be made strong enough to induce scission?

In general, if you want to interact with the nucleus you need photons on the order of 1 MeV or more, whose wavelengths are comparable to the size of the nucleus. These are gamma rays, which are not optical photons. There are ways to boost optical photons to those energies (like inverse Compton scattering), but the article says nothing about that either. I would think inverse Compton scattering of a chirped pulse off an electron packet in an accelerator would completely destroy the sharp timing and reflect the distribution of the electrons instead.

TTPrograms · 4 years ago
I don't think it's correct to think of a laser as a source in some thermal equilibrium. "Concentrating temperature" passively from sources in thermal equilibrium is forbidden, but there's nothing preventing "concentrating power".

Pulsed lasers bring material interactions into a highly non-linear regime: the photon intensity is so high that multi-photon absorption is common. In the typical nuclear-decay regime you are concerned with single-photon absorption, and there the gamma-ray intuition is correct. There are also a number of approaches where various targets hit with ultrafast lasers produce a controllable flux of gamma rays, which is then used in downstream experiments.

TTPrograms commented on 10 Years of Nukemap   blog.nuclearsecrecy.com/2... · Posted by u/Hooke
acidburnNSA · 4 years ago
Bioweapons are increasingly likely to be the end of humanity. With fancier tools like CRISPR, you're getting closer and closer to one person in a basement making something that can propagate across the whole world. Nuclear weapons are inherently limited by the fact that you need huge institutions to produce and deliver large numbers of them, and without large numbers you can't destroy humanity, given the large size of the planet.

https://cco.ndu.edu/Portals/96/Documents/prism/prism_4-4/Str...

TTPrograms · 4 years ago
Have we found diseases that could propagate across the world that are either (1) of virality greatly in excess of modern diseases, i.e. an R0 well beyond omicron's, or (2) untreatable/incurable and potentially latent for weeks to months?

A disease capable of coming close to ending civilization would need properties far beyond any disease observed so far. Either it needs to infect massive populations before we detect it, or it has to transmit over long distances (miles) despite moderate precautions like masking and air filtering. I think there's good reason to doubt such a pathogen could exist. The closest I could imagine would be an HIV-like immunodeficiency virus transmissible via aerosol - but even that would have to cause disease much more severe than HIV, with resistance in less than 0.01% of the population.

TTPrograms commented on Feds arrest couple, seize $3.6B in hacked Bitcoin funds   washingtonpost.com/nation... · Posted by u/mikeyouse
fxtentacle · 4 years ago
Shouldn't all true crypto believers hate this news?

It's the government trying to enforce their opinion of who should own those Bitcoins, thereby taking power away from the owner that the network has decided on, which would be "whoever has the cryptographic keys".

TTPrograms · 4 years ago
Obvious no-true-Scotsman. Believing that the goal of crypto is to circumvent laws regarding possession and theft is at most a fringe position. The fact that this is at the top of HN demonstrates how devoid of merit crypto discussion here is.
TTPrograms commented on New material that can absorb and release enormous amounts of energy   phys.org/news/2022-02-sci... · Posted by u/prostoalex
MengerSponge · 4 years ago
The length scale of a metamaterial's features should be complementary to the length scale the metamaterial is acting on.

Kind of squirrely, and I tried really hard to phrase that so it isn't a tautology. But if you're dealing with radio waves, your metamaterial can have huge (meter-scale) features. If you're dealing with visible light, your feature size is on the hundreds of nanometer scale.

Thin films have a characteristic bending length: https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.11..., and this determines the size of features you should pattern to exploit that bending/folding interaction.

TTPrograms · 4 years ago
I think that in mechanical metamaterials the characteristic length defining the "metamaterial regime" is instead the wavelength of pressure waves in the material you're considering - much as in electromagnetics you want the patterning (cell) length to be much smaller than the wavelength of the radiation. In work like this they are effectively looking at 0.1 Hz or lower - near-static loading - so I think the pattern size can be quite large (around a 600 m wavelength in bulk rubber at 0.1 Hz).

This interpretation also explains the localized behavior in the shock-experiment videos. When the platform is dropped, an impulse is applied with frequencies above the metamaterial regime for the material, so you see a highly asymmetric response through the material - implying that the macroscopic "metamaterial" property characterization is insufficient to predict the response, and analysis must be done at feature scales rather than wavelength scales. The idea is that a "metamaterial" is a structure that can be treated as a bulk continuous material with a particular defined response, as long as the interacting frequencies are sufficiently low (wavelengths far above the characteristic feature size of the structure).
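
As a quick sanity check on that wavelength figure (the ~60 m/s bulk sound speed for soft rubber is my assumed number, not from the paper):

\lambda = \frac{c}{f} \approx \frac{60\ \text{m/s}}{0.1\ \text{Hz}} = 600\ \text{m}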

I think the bending analysis you cite can determine the relative feature sizes desirable for certain "micro-scale" mechanical behavior, but it's possible to build a mechanical "metamaterial" much larger than that as well.

TTPrograms commented on Lab Leak 2.0?   bprice.substack.com/p/lab... · Posted by u/howaboutnope
TTPrograms · 4 years ago
The sample distribution of viruses is incredibly important for this sort of analysis, and much of the argument here only makes sense through the lens of uniform virus sequencing. If you have imbalanced sequencing and imbalanced transmission you can also explain these differences.

The important thing is that mutations occur at a certain rate per virus per unit time. If you have an isolated population that's sequenced infrequently then (1) that strain will appear to evolve more slowly as there's a smaller population capable of mutating, and (2) once that strain is sequenced it's going to look far from what you've seen already since you haven't been tracking the intermediate mutations in this population.

The S/N ratio can be analyzed in terms of a random walk in high dimension. Variance in these walks grows over time (in terms of distance from origin, i.e. number of mutations), so the discrepancy doesn't seem super far from what's plausible under the null hypothesis. Perhaps someone can do the math on that.
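
A toy version of that math (every number below is illustrative, not fitted to real sequence data):

```python
import numpy as np

rng = np.random.default_rng(0)

n_lineages = 10_000  # unobserved lineages evolving in parallel (made up)
rate = 2.0           # expected mutations per lineage per unit time (made up)
T = 25               # time elapsed while the lineage went unsequenced

# In high dimension back-mutations are rare, so distance from the origin is
# roughly the raw mutation count, which is Poisson(rate * T) per lineage.
mutations = rng.poisson(rate * T, size=n_lineages)

print(f"mean distance    : {mutations.mean():.1f}")
print(f"std deviation    : {mutations.std():.1f}")  # grows like sqrt(rate * T)
print(f"99.9th percentile: {np.percentile(mutations, 99.9):.0f} mutations")
```

With many unsampled lineages, the most extreme one can sit several standard deviations beyond the mean, which is the kind of innocuous explanation gestured at above.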

The hypothesis merits further investigation, but the strength of the evidence presented here really requires some complex statistical analysis to determine if innocuous explanations fit. The analysis is far more complex than I would expect an epidemiologist or virologist to apply in the course of their work.

u/TTPrograms

Karma: 1676 · Cake day: January 4, 2014
About
Background in Physics, EECS, Math. Currently working in algorithms development for compressive sensing and image processing, with interests in optimization-based approaches.

If it's up to me I'll write a 5-page math derivation before every 50 lines of code - but that code will do some crazy stuff.
