RoxasShadow commented on Ask HN: Who is hiring? (January 2017)    · Posted by u/whoishiring
guessmyname · 9 years ago
"We do not care about your academic degrees [...]"

Unfortunately, embassies do care about these. Getting a work permit to immigrate to another country without a university degree certainly reduces the chances of getting the embassy's approval. Even in countries like Germany, where there is some flexibility if you have a senior profile, a degree certainly speeds up the process.

In fact, in this industry, fewer people care about a university degree than don't, the exception being those in research. But HR departments still filter out good candidates because they lack a degree: most HR employees are either not trained properly or know how difficult it is to hire a foreigner without a formal educational background, so they immediately throw good resumes in the trash because of it.

RoxasShadow · 9 years ago
I work for a company called Honeypot. We have placed many non-EU developers without degrees at startups in Germany and the Netherlands.

If you have no degree, you just need to prove that you have sufficient skills or work experience. It's more paperwork for the company, but for them it's usually worth it.

If you want more info, I can ask the team who looks after it.

RoxasShadow commented on Rails 5.0.0.beta1.1 released   weblog.rubyonrails.org/20... · Posted by u/aaronbrethorst
RoxasShadow · 10 years ago
This is more than a simple upgrade to Rails 5, since it includes several high-priority security patches for Rails 5.0, 4.2, 4.1, and 3.2.

It would be great if you could mention this in the title.

RoxasShadow commented on Show HN: EDB – A framework to make and manage backups of your database   github.com/RoxasShadow/ED... · Posted by u/RoxasShadow
falcolas · 10 years ago
The mysql driver needs additional work. In particular, it needs a '--single-transaction' flag, or a global lock, to ensure that the dump is consistent, especially if you want to dump multiple databases concurrently.

Doing one dump per table with chunking (each file holds N rows) would help with both speed and the on-disk size of backups, by allowing S3 or some other tool to de-duplicate data between incremental backups.

It also wouldn't hurt to capture the binlog position, if available, to enable point-in-time recovery.

Have a look at mydumper for an idea of how another tool implemented these:

https://launchpad.net/mydumper
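
For illustration, here is a minimal sketch of what a consistent per-database dump that also captures the binlog position could look like. --single-transaction and --master-data=2 are standard mysqldump options; the database names, paths, and the Python wrapper are placeholders for this comment, not EDB's actual code:

    import subprocess
    from pathlib import Path

    # Hypothetical database names and output directory; EDB would presumably
    # take these from its own configuration.
    DATABASES = ["app_production", "analytics"]
    OUTPUT_DIR = Path("/var/backups/mysql")

    for db in DATABASES:
        outfile = OUTPUT_DIR / f"{db}.sql"
        with outfile.open("w") as f:
            # --single-transaction gives a consistent InnoDB snapshot without a
            # global lock; --master-data=2 records the binlog file and position
            # as a comment in the dump (requires the binary log to be enabled),
            # which is what point-in-time recovery needs.
            subprocess.run(
                ["mysqldump", "--single-transaction", "--master-data=2", db],
                stdout=f,
                check=True,
            )

mydumper goes further than this by dumping each table in parallel and, if I recall correctly, supporting row-based chunking, which is what makes the de-duplication idea above pay off.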

RoxasShadow · 10 years ago
I will open an issue with your suggestions. Thanks.

u/RoxasShadow

Karma: 17 · Cake day: May 19, 2015
About
[ my public key: https://keybase.io/roxasshadow; my proof: https://keybase.io/roxasshadow/sigs/haaGCTgwHgSa3p_H91nJUP_B2tsS2LPXg0xJ1pTOPZU ]