Readit News
denial commented on The Tale of Daniel   hillelwayne.com/post/tale... · Posted by u/hwayne
denial · 2 years ago
As do I.
denial commented on Mamba: The Easy Way   jackcook.com/2024/02/23/m... · Posted by u/jackcook
pama · 2 years ago
Not sure how much detail you need, but generally there exist implicit and explicit integrators for numerically solving (integrating) ODEs. The implicit ones, like the one used here, tend to be more stable. The ideas behind SSMs come from control theory, which used integrators with stability guarantees so that the rest of the neural network can focus on other aspects of the problem.
denial · 2 years ago
That's a helpful pointer. Thank you.
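A minimal sketch of the explicit vs. implicit distinction mentioned above, using the scalar test equation dh/dt = a*h with a < 0 (the values of a and the step size are illustrative, not from the paper):

    # Scalar test ODE dh/dt = a*h with a < 0: the true solution decays to zero.
    a = -10.0
    dt = 0.3                      # deliberately large step to expose instability
    h_explicit = h_implicit = 1.0

    for _ in range(20):
        # Explicit (forward) Euler:  h_t = h_{t-1} + dt*a*h_{t-1}
        h_explicit = h_explicit + dt * a * h_explicit
        # Implicit (backward) Euler: h_t = h_{t-1} + dt*a*h_t  =>  h_t = h_{t-1}/(1 - dt*a)
        h_implicit = h_implicit / (1 - dt * a)

    print(h_explicit)   # blows up: amplification factor |1 + dt*a| = 2
    print(h_implicit)   # decays toward zero: amplification factor 1/(1 - dt*a) = 0.25

The implicit update stays stable at any step size when a < 0, which is the kind of guarantee the comment above is referring to.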
denial commented on Mamba: The Easy Way   jackcook.com/2024/02/23/m... · Posted by u/jackcook
denial · 2 years ago
Something minor I always wonder about when I read Mamba is the discretization.

All of the sources I see referred to as derivations of it have a discretization of the form

h_t = A h_{t-1} + B x_{t-1} for the first line, instead of the given one of the form h_t = A h_{t-1} + B x_t.

Does anyone know why this is?
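For concreteness, a small sketch of the two update rules in question for a scalar SSM (the coefficients and input sequence are made up for illustration, not taken from the paper):

    # Two discretization conventions for a scalar SSM, differing only in
    # which input sample enters the state update.
    A_bar, B_bar = 0.9, 0.1           # illustrative discretized coefficients
    x = [1.0, 0.5, -0.2, 0.3]         # illustrative input sequence

    def scan(use_current_input):
        h, states = 0.0, []
        for t in range(1, len(x)):
            # derivation form:  h_t = A_bar*h_{t-1} + B_bar*x_{t-1}
            # Mamba-paper form: h_t = A_bar*h_{t-1} + B_bar*x_t
            u = x[t] if use_current_input else x[t - 1]
            h = A_bar * h + B_bar * u
            states.append(h)
        return states

    print(scan(use_current_input=False))   # B x_{t-1} convention
    print(scan(use_current_input=True))    # B x_t convention, as written in Mamba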

denial commented on Tacit Knowledge Is Dangerous   er4hn.info/blog/2023.08.2... · Posted by u/Wingy
denial · 2 years ago
This is an odd definition of tacit knowledge. My understanding is that it's more about the intuition for concepts/systems that's difficult to codify because it's so contextual. This "tribal knowledge" perspective seems more like processes and facts that aren't documented, not because of any inherent difficulty but because of priorities. Not that there isn't some intersection between the two.
denial commented on Goldman Sachs reportedly said Apple Card savings account was a mistake   9to5mac.com/2023/10/16/ap... · Posted by u/ksec
trevortheblack · 2 years ago
Apologies for asking an off-topic question, but what is HENRY? I've never heard of this before.
denial · 2 years ago
High earner, not rich yet.
denial commented on Linear algebra for programmers   coffeemug.github.io/spakh... · Posted by u/coffeemug
Corsome · 2 years ago
> This is actually not so strange– you can think of many structures as functions. For example, you can think of a number 3 as a function. When you multiply it by things, it makes them three times bigger.

I don't see how 3 can be a function from this example. "3*" (partially applied multiplication by 3) looks more like it.

Matrices and vectors as functions? Yeah, if the argument is within bounds. That makes it just an indexing operation.

(I guess one can view 3 as a one element vector but that sounds like a degenerate case)

Or maybe I'm missing something...?

denial · 2 years ago
I take it as analogous to the association of a matrix with a linear transformation; that association is via multiplication.
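A small sketch of that association, with illustrative values (neither the helper name nor the rotation matrix comes from the article):

    import numpy as np

    # A number or a matrix can be viewed as the function "multiply by me".
    def as_function(a):
        return (lambda v: a @ v) if isinstance(a, np.ndarray) else (lambda v: a * v)

    three = as_function(3)              # the number 3 as "scale by 3"
    M = np.array([[0, -1],
                  [1,  0]])             # 90-degree rotation matrix
    rotate = as_function(M)             # the matrix M as a linear map R^2 -> R^2

    print(three(7))                     # 21
    print(rotate(np.array([1, 0])))     # [0 1]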
denial commented on Dementia risk linked to blood-protein imbalance in middle age   nature.com/articles/d4158... · Posted by u/pseudolus
denial · 3 years ago
(Paywalled link to the paper: https://www.science.org/doi/10.1126/scitranslmed.adf5681)

Abstract:

A diverse set of biological processes have been implicated in the pathophysiology of Alzheimer’s disease (AD) and related dementias. However, there is limited understanding of the peripheral biological mechanisms relevant in the earliest phases of the disease. Here, we used a large-scale proteomics platform to examine the association of 4877 plasma proteins with 25-year dementia risk in 10,981 middle-aged adults. We found 32 dementia-associated plasma proteins that were involved in proteostasis, immunity, synaptic function, and extracellular matrix organization. We then replicated the association between 15 of these proteins and clinically relevant neurocognitive outcomes in two independent cohorts. We demonstrated that 12 of these 32 dementia-associated proteins were associated with cerebrospinal fluid (CSF) biomarkers of AD, neurodegeneration, or neuroinflammation. We found that eight of these candidate protein markers were abnormally expressed in human postmortem brain tissue from patients with AD, although some of the proteins that were most strongly associated with dementia risk, such as GDF15, were not detected in these brain tissue samples. Using network analyses, we found a protein signature for dementia risk that was characterized by dysregulation of specific immune and proteostasis/autophagy pathways in adults in midlife ~20 years before dementia onset, as well as abnormal coagulation and complement signaling ~10 years before dementia onset. Bidirectional two-sample Mendelian randomization genetically validated nine of our candidate proteins as markers of AD in midlife and inferred causality of SERPINA3 in AD pathogenesis. Last, we prioritized a set of candidate markers for AD and dementia risk prediction in midlife.

denial commented on Hilbert Transform   electroagenda.com/en/hilb... · Posted by u/topsycatt
denial · 3 years ago
Singular integrals are a lovely topic, especially from a Fourier analytic perspective. Taking the Fourier transform, F(H(f))(x) = -i * sgn(x) * F(f)(x), which implies H^2 = - I. H has the Fourier multiplier -i * sgn(x). The Riesz transforms R_j are a higher dimensional generalization of the Hilbert transform with Fourier multipliers -i * x_j/|x_j|, which leads to the nice property sum R_j^2 = -I.
denial · 3 years ago
I found a small error I can't edit into this comment: the Fourier multiplier of R_j is -i * x_j/|x|.
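A quick numerical sketch of the multiplier description above, restricted to the 1D Hilbert transform (the grid size and test frequency are arbitrary choices):

    import numpy as np

    # Discrete sketch of H as the Fourier multiplier -i*sgn(xi).
    # Note sgn(0) = 0, so the mean (zero-frequency) component is annihilated.
    n = 1024
    t = np.arange(n)
    multiplier = -1j * np.sign(np.fft.fftfreq(n))

    def hilbert(g):
        return np.fft.ifft(multiplier * np.fft.fft(g)).real

    f = np.cos(2 * np.pi * 5 * t / n)                              # a mean-zero test signal
    print(np.allclose(hilbert(f), np.sin(2 * np.pi * 5 * t / n)))  # True: H maps cos to sin
    print(np.allclose(hilbert(hilbert(f)), -f))                    # True: H^2 = -I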
u/denial · Karma: 226 · Cake day: October 2, 2020