FreeHugs commented on x (@x) / Twitter   twitter.com/i/flow/login?... · Posted by u/FreeHugs
FreeHugs · 3 years ago
What happened to the real x, "gene x", "photographer and friend of lemurs"?

https://web.archive.org/web/20220906224326/https://twitter.c...

FreeHugs commented on Ask HN: What happened to Web3 startups?    · Posted by u/Freddie111
FreeHugs · 3 years ago
At the core of Web3 projects is the idea that one can own their identity via cryptographic proof.

Over time, this concept will find its way into many projects. Especially into open source projects which try to make the web a better place.

For example, as soon as browsers support a DNS based on cryptographic proofs, like ENS, other technologies with URL-based identities (like ActivityPub) will automatically support cryptographic identities, which would bring them to projects like Mastodon and Bluesky.

FreeHugs commented on Microsoft is integrating Copilot into Windows 11   engadget.com/microsoft-wi... · Posted by u/agomez314
FreeHugs · 3 years ago
The big question is whether this will be a net positive or just add more bloat.

They tried the same with Bing and failed. Bing did not see any uptick even with the new AI features:

https://gs.statcounter.com/search-engine-market-share/deskto...

My takeaway is that users are not as stupid and lazy as Microsoft thinks they are. They want a clean UI. And if they have to sign up for ChatGPT to get it, they will. ChatGPT has already gained as many users as Bing has. With a clean UI, Bing might have already doubled its userbase.

Why doesn't Microsoft just remove all the clutter from Bing and enjoy the benefits of having its search engine be the default on all the Windows tablets and laptops out there? Why waste it all and make every user change their search engine to Google?

FreeHugs commented on This blog is hosted on my Android phone   androidblog.a.pinggy.io/... · Posted by u/thunderbong
FreeHugs · 3 years ago
Why is Termux not on the Google Play Store?
FreeHugs commented on A visual book recommender   nathanrooy.github.io/post... · Posted by u/squidhunter
FreeHugs · 3 years ago
Reminds me of https://www.literature-map.com

It's a map of authors from around the world, sorted by overlap in readership. I found some of my favorite writers by browsing it.

I wonder which approach is better suited to find something that is spot on to my interests.

When I think of my favorite books, they usually are the most popular books of their authors.

Are there any counterexamples, where an author wrote a book that is more profound than their biggest hit but got overlooked for some reason?

FreeHugs commented on Making Python faster with Rust   ohadravid.github.io/posts... · Posted by u/teddykoker
winrid · 3 years ago
Python's for loop implementation is slow, too. You can use built-in utils like map(), which are "native" and can be a lot faster than a for loop with an append:

https://levelup.gitconnected.com/python-performance-showdown...

FreeHugs · 3 years ago
I don't think it's the loop implementation. The stuff in the loop should take multiple orders of magnitude more time than the loop itself:

    for poly in polygon_subset:
        if np.linalg.norm(poly.center - point) < max_dist:
            close_polygons.append(poly)
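A quick way to sanity-check this claim (a sketch with synthetic data, not the article's benchmark): time the bare iteration against the same loop with one np.linalg.norm call per element.

```python
import timeit
import numpy as np

rng = np.random.default_rng(0)
centers = [rng.random(2) for _ in range(1000)]
point = rng.random(2)

# Bare loop: just the iteration and list-building overhead.
bare = timeit.timeit(lambda: [c for c in centers], number=100)

# Same loop, but with one small np.linalg.norm call per element.
with_norm = timeit.timeit(
    lambda: [c for c in centers if np.linalg.norm(c - point) < 0.5],
    number=100,
)
print(f"bare: {bare:.4f}s  with norm: {with_norm:.4f}s")
```

On typical CPython builds the norm calls dominate by one to two orders of magnitude, which is why swapping the loop construct alone (map() vs. for) barely moves the needle.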

FreeHugs commented on Making Python faster with Rust   ohadravid.github.io/posts... · Posted by u/teddykoker
hoseja · 3 years ago
The final code takes just 2.90ms per iteration.
FreeHugs · 3 years ago
The rest is not a fair comparison, because it rewrites the used libraries, not the application code.

You can always speed up an application if you rewrite the used libraries to match your specific use case.

FreeHugs commented on Making Python faster with Rust   ohadravid.github.io/posts... · Posted by u/teddykoker
nickstinemates · 3 years ago
One carries the entire feature set of the python runtime, the other is compiled.
FreeHugs · 3 years ago
The time is spent in this 3-line loop:

    for poly in polygon_subset:
        if np.linalg.norm(poly.center - point) < max_dist:
            close_polygons.append(poly)
I don't think the entire feature set of the Python runtime is involved in this.
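One way to test that hypothesis without leaving Python (a sketch; this Polygon is a hypothetical stand-in for the article's class, assumed to carry a `center` array): stack the centers once and replace the per-polygon norm calls with a single vectorized one.

```python
import numpy as np

class Polygon:
    # Hypothetical stand-in for the article's Polygon: only the
    # `center` attribute matters for this function.
    def __init__(self, center):
        self.center = np.asarray(center, dtype=float)

def find_close_polygons_vectorized(polygons, point, max_dist):
    # One stacked array and one norm call replace len(polygons)
    # separate np.linalg.norm invocations.
    centers = np.stack([p.center for p in polygons])
    dists = np.linalg.norm(centers - point, axis=1)
    return [p for p, d in zip(polygons, dists) if d < max_dist]
```

If this closes most of the gap, the cost was per-call NumPy overhead in the loop body, not the Python runtime's feature set.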

FreeHugs commented on Making Python faster with Rust   ohadravid.github.io/posts... · Posted by u/teddykoker
FreeHugs · 3 years ago
The most important part of the article seems to be that this Python code is taking "an avg of 293.41ms per iteration":

        def find_close_polygons(
            polygon_subset: List[Polygon], point: np.array, max_dist: float
        ) -> List[Polygon]:
            close_polygons = []
            for poly in polygon_subset:
                if np.linalg.norm(poly.center - point) < max_dist:
                    close_polygons.append(poly)

            return close_polygons
And after replacing it with this Rust code, it is taking "an avg of 23.44ms per iteration":

        use pyo3::prelude::*;
        use ndarray_linalg::Norm;
        use numpy::PyReadonlyArray1;

        #[pyfunction]
        fn find_close_polygons(
            py: Python<'_>,
            polygons: Vec<PyObject>,
            point: PyReadonlyArray1<f64>,
            max_dist: f64,
        ) -> PyResult<Vec<PyObject>> {
            let mut close_polygons = vec![];
            let point = point.as_array();
            for poly in polygons {
                let center = poly
                    .getattr(py, "center")?
                    .extract::<PyReadonlyArray1<f64>>(py)?
                    .as_array()
                    .to_owned();

                if (center - point).norm() < max_dist {
                    close_polygons.push(poly)
                }
            }

            Ok(close_polygons)
        }
Why is the Rust version 13x faster than the Python version?
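One likely contributor (a rough micro-benchmark, not from the article): for tiny arrays, each np.linalg.norm call is dominated by Python-side dispatch and temporary-array overhead, work the compiled Rust loop mostly avoids.

```python
import math
import timeit
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([0.5, 1.5])

# Per-call NumPy path: allocates a temporary array for (a - b),
# then goes through the generic norm machinery.
t_np = timeit.timeit(lambda: np.linalg.norm(a - b), number=10_000)

# Same arithmetic on plain floats, without the dispatch overhead.
t_py = timeit.timeit(
    lambda: math.hypot(a[0] - b[0], a[1] - b[1]), number=10_000
)
print(f"np.linalg.norm: {t_np:.4f}s  math.hypot: {t_py:.4f}s")
```

The gap is per-call overhead, not arithmetic, which is one plausible reason a compiled loop over the same data can be an order of magnitude faster.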
