jbverschoor · 3 years ago
So basically you run an endless script to fetch https://www.tesla.com/sites/default/settings.php and hope that some day there will be a minor nginx config error which lets you download the php source instead of executing it.

This will happen some day, so invest 5 bucks per month and you'll be able to exploit Tesla at some point, and maybe be first in line for the Cybertruck :-)
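A sketch of the kind of watcher the comment describes (the URL is real, but everything else here is a hypothetical illustration, not a recommendation): the tell would be raw PHP tags showing up in the response body instead of rendered output.

```python
import time
import urllib.request

URL = "https://www.tesla.com/sites/default/settings.php"

def looks_like_source(body: bytes) -> bool:
    # If the server executes the file, we never see PHP tags; a raw
    # "<?php" in the response body means the source itself was served.
    return b"<?php" in body

def poll_forever(url: str, interval: int = 3600) -> None:
    # Check once an hour until the hypothetical misconfiguration appears.
    while True:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if looks_like_source(resp.read()):
                    print("settings.php source is being served!")
                    return
        except OSError:
            pass  # 403/404/network error: nothing leaked yet
        time.sleep(interval)
```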

rvnx · 3 years ago
This seems like too sophisticated an attack; sometimes simplicity is better: https://samcurry.net/cracking-my-windshield-and-earning-1000...
walrus01 · 3 years ago
Time to try naming your tesla "drop table vehicles;"
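For anyone who missed the reference (xkcd 327, little Bobby Tables): the joke only lands against code that splices user input straight into SQL strings. A minimal sketch in Python with sqlite3, showing why a parameterized query shrugs the name off:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vehicles (id INTEGER PRIMARY KEY, name TEXT)")

# Parameterized insert: the name is bound as data and never parsed as
# SQL, so the joke payload is stored verbatim instead of dropping the table.
payload = "drop table vehicles;"
conn.execute("INSERT INTO vehicles (name) VALUES (?)", (payload,))

rows = conn.execute("SELECT name FROM vehicles").fetchall()
print(rows)  # the table still exists, payload stored as plain text
```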
j-bos · 3 years ago
Great read

grubby · 3 years ago
this was such a great read, people like you make me want to learn more and more everyday
grumple · 3 years ago
Pretty sure every site on IPv4 gets probed multiple times a day for common config leaks and other misconfigurations. Happens to all of mine.
jbverschoor · 3 years ago
Yeah, but if a gitignore tells you where to look, and it isn't even blocked by a WAF rule, it makes an interesting target, especially at one of the largest companies out there.

You shouldn't even be able to execute settings.php

c7DJTLrn · 3 years ago
Finally, a compelling reason to use IPv6.
TechBro8615 · 3 years ago
This comment transported me back to 2010 or thereabouts when this happened to Facebook. I remember being surprised at the simplicity of the code and making a lot of jokes about "build a facebook clone" ads on freelance websites.
rbanffy · 3 years ago
I am sure there are lots of automated scripts doing precisely that with pretty much every company that has a website.

I used to keep a hall of shame on my main site, because looking for "settings.php" or "global.asa" on a Zope site was just silly.

retrocryptid · 3 years ago
Except that you'll find that error long before the Cybertruck ships. Heck, you'll probably see the rebirth of NFTs and BTC over US$40,000 before the Cybertruck ships.
tomjakubowski · 3 years ago
Interesting, the exclude file (actually, everything under .git/info) 403s, while .git/index is a 404.

- https://www.tesla.com/.git/info/exclude

- https://www.tesla.com/.git/index

README.txt 403s too. https://www.tesla.com/README.txt

edit: just going to add files I've found here:

- https://www.tesla.com/.editorconfig

- https://www.tesla.com/profiles/README.txt
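The 403-vs-404 pattern above is easy to map systematically. A small sketch (the path list is just the handful of files from this thread, not anything exhaustive):

```python
import urllib.error
import urllib.request

PATHS = [
    "/.git/info/exclude",
    "/.git/index",
    "/README.txt",
    "/.editorconfig",
    "/profiles/README.txt",
]

def probe(base: str, path: str) -> int:
    # HEAD request: we only care about the status code, not the body.
    req = urllib.request.Request(base + path, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 403, 404, ...

# for p in PATHS:
#     print(p, probe("https://www.tesla.com", p))
```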

TechTechTech · 3 years ago
Merlin04 · 3 years ago
bumblewax · 3 years ago
Two space tabs, nice.
jahsome · 3 years ago
Add a trailing slash to index and it 403s
retrocryptid · 3 years ago
sigh
retrocryptid · 3 years ago
really? five down-votes because I sighed?

If you're going to down-vote me, down-vote me because I mentioned Elon is a human being, with human flaws and human strengths and not the resurrection of Supply-Side-Jesus.

ericmcer · 3 years ago
A company's marketing website and their actual products have little in common. I would be surprised if any engineers even work on the marketing website, and blown away if it is co-located with something sensitive.
FormerBandmate · 3 years ago
https://xkcd.com/932/

Ffs, a tech forum should be better than this

wink · 3 years ago
No, it's a valid complaint. I've seen it at several companies: the development team was eager to present a professional website (so that anyone in the know looking wouldn't find such embarrassing stuff and maybe be scared off as a potential new hire or customer), but it ended up in the hands of the marketing department. To the degree that the infra was moved to a different domain, so the wordpress install at "www.example.com" could never even remotely do anything with cookies at "example.net" - but yes, that might have been a tad paranoid ;)

I think the person you were replying to wasn't playing down what happened, but explaining exactly what the cartoon says. Its not being important to the general public doesn't mean it isn't a problem.

ranman · 3 years ago
I would judge a vendor or consulting firm based on their marketing website. Why wouldn't I judge a car maker?
anonym29 · 3 years ago
If you think .gitignore leaks too much info, you're going to love https://www.tesla.com/robots.txt
soneil · 3 years ago
The start/stop at the bottom makes that look like it's come canned with a CMS and they've just tacked on what they needed to. It's 90% boilerplate.
chx · 3 years ago
It's hardly a secret that tesla.com is Drupal -- both that gitignore and the robots.txt shout it quite loudly, to be fair. One of the larger Drupal agencies, Lullabot, includes Tesla in its client list: https://www.lullabot.com/our-work and they are looking for a sr. backend Drupal engineer https://www.tesla.com/careers/search/job/sr-software-enginee... which I would take if the company were not led by Musk.
Neil44 · 3 years ago
And the bumph at the top - crawlers run by Yahoo! and Google - lol
jongjong · 3 years ago
If that's all the dirt that thousands of vengeful fired Twitter ex-employees could find, then Tesla must have excellent security.
bakugo · 3 years ago
Yeah this screams complete and utter desperation. Like, I get that hating Elon is what all the cool kids at school are doing this month but do we really need this immature garbage on the front page of HN all day?
Hamuko · 3 years ago
Well, I'd personally at least find some hilarity in being a Twitter engineer fired by one of those 10x Tesla engineers while they're publishing their .gitignore files via HTTPS (which probably means that their Nginx configuration is fucked).
jasonvorhe · 3 years ago
People are just having some fun.
threatripper · 3 years ago
This looks like a default file from a Drupal installation: https://api.drupal.org/api/drupal/robots.txt/7.x
m00x · 3 years ago
Really doesn't leak much, and robots.txt is supposed to be accessible from the internet.
anonym29 · 3 years ago
Yes, it's meant to be public, but you need not disclose all of what is contained inside of it. I've been on many pentests where paths provided by robots.txt that I wouldn't have obtained any other way led to exploitable vulnerabilities.

For some reason, a considerable number of people don't seem to think twice about adding sensitive paths to robots.
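As a sketch of how trivially that information falls out, here is a hypothetical snippet that pulls every Disallow path from a robots.txt body; the result is effectively a curated list of URLs the operator flagged as worth hiding:

```python
def disallowed_paths(robots_txt: str) -> list[str]:
    # Collect every "Disallow:" path from a robots.txt body.
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths

sample = """User-agent: *
Disallow: /admin/
Disallow: /core/
Disallow:
"""
print(disallowed_paths(sample))  # ['/admin/', '/core/']
```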

marginalia_nu · 3 years ago
Did an inventory based on my crawler data a while back.

Relatively common to find sensitive or embarrassing links singled out in robots.txt.

Especially in old large organizations, like universities.

slaymaker1907 · 3 years ago
Apparently Tesla is FOSS, see https://www.Tesla.com.
Ptchd · 3 years ago
Where can I get the FSD (Fake Self Driving) source code?
tacker2000 · 3 years ago
It's just random CMS bs. Nothing to hate Elon about.
reaperducer · 3 years ago
If you think .gitignore leaks too much info, you're going to love https://www.tesla.com/robots.txt

I wonder if these are some of the same people that Musk brought in to refactor Twitter.

hackernewds · 3 years ago
Imagine the guy at the helm here is now responsible for the most sensitive DMs of premiers and state leaders
AtNightWeCode · 3 years ago
Wow, top score for uniqueness, in the field of being stupid...
AtNightWeCode · 3 years ago
LOL, why, just wow.
madmod · 3 years ago
I found a bug in the Tesla Model 3 reservation system that allowed anyone to get a reservation for free. Reported it via HackerOne (or maybe it was Bugcrowd, don't remember) and got told it was of no consequence and would be filtered out later or something. Got no bounty for hours of work.

I accidentally ordered my model 3 with a free reservation, not the one I actually paid for.

jonathanyc · 3 years ago
Given that people are selling reservations for thousands of dollars, I think you deserved something for reporting the issue. But I suppose being a hardcore engineer means never having to say you're sorry.
revskill · 3 years ago
So, should we just add .gitignore to .gitignore and problem solved ?
kadoban · 3 years ago
You're joking of course, but that likely won't do anything useful.

If it's tracked, then ignore has no effect. If it's not tracked, then you might as well use .git/info/exclude, which is pretty much the same thing but not tracked, or you can use a global excludes file; ~/.gitignore is common (you have to configure git to point at it, iirc).

It _could_ make sense to ignore the .gitignore if some other tool is parsing and using that file, but that pattern is...troublesome so I hope not.
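A quick sketch of the alternatives mentioned above, run in a throwaway repo (paths and patterns are just examples):

```shell
# Demo repo: .git/info/exclude uses .gitignore syntax but lives
# inside .git/, so it is never tracked, committed, or deployed.
tmp=$(mktemp -d) && cd "$tmp" && git init -q

echo "*.log" >> .git/info/exclude
touch debug.log

git status --porcelain   # prints nothing: debug.log is ignored

# Or use a global excludes file shared by every repo on the machine:
# git config --global core.excludesFile ~/.gitignore
```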

vbezhenar · 3 years ago
~/.config/git/ignore
agumonkey · 3 years ago
the classic https://news.ycombinator.com/item?id=31420268

> Git ignores .gitignore with .gitignore in .gitignore

manojlds · 3 years ago
.gitignore to Dockerignore

(Partly joking)

alvis · 3 years ago
No. You should never check out a site directly from git to begin with. And not letting other people know which files are ignored by git doesn't mean they can't access them. :/
teknopaul · 3 years ago
Nonsense.

Everyone uses git for source control, of course you check out a site with git.

All you are telling people with a .gitignore is what is _not_ available.

It means exactly that people can not access them if your site is a checkout, because they aren't there.

hankchinaski · 3 years ago
I like the simplicity and pragmatism of using drupal. I wouldn’t work with it myself but it was probably the cheapest/fastest way to get a similar site up and running
throwaway6734 · 3 years ago
If you stick completely within the Drupal "standard path", it's a great way to get a site up and running. Once you step outside of that path it's an absolute misery.
jpoesen · 3 years ago
Dunking on a tech while using a throwaway account and not providing details on why you find it an absolute misery... not very useful or trustworthy.
rbanffy · 3 years ago
Indeed. I'd use Plone, but it's overkill for a website like this.
behnamoh · 3 years ago
Can someone explain why this is leaky and how it can be exploited by malicious actors?
anonym29 · 3 years ago
It's leaky because it's globally accessible and provides information that isn't otherwise readily apparent.

There is no guarantee that an exposed .gitignore (or other exposed files, like .htaccess, robots.txt, etc) will be exploitable, but they aid in the discovery process and may help adversaries uncover exploitable vulnerabilities they might have otherwise missed.

At the extreme, I've seen paths of backups of the production database listed in a publicly readable .gitignore, and that database backup was publicly accessible, too.

Most of the time nothing sensitive is revealed, but defense in depth suggests it's better not to upload files like these to your web server at all, unless the server actually uses them (like .htaccess) or crawlers do (like robots.txt). If you do upload them, they ought not be publicly readable (unless that's intended, as with robots.txt), and even then you want to make sure nothing sensitive is in any such file. Even if there's nothing sensitive in them now, there's no guarantee nothing sensitive will ever be added.
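For the nginx case specifically, a minimal rule of the kind that blocks this whole class of leak might look like the sketch below (not Tesla's actual config; the /.well-known/ carve-out is for things like ACME/Let's Encrypt challenges):

```nginx
# Refuse any path segment starting with a dot: .git, .gitignore,
# .editorconfig, .htaccess, and so on.
location ~ /\.(?!well-known/) {
    return 404;  # 404 rather than 403, so probes can't confirm the file exists
}
```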

oceanplexian · 3 years ago
I'm gonna give my counter take. Information disclosure is something that the DevSecOps(tm) crowd spends a disproportionate amount of time on for little benefit. The number of security professionals who don't know how to code, but learned Nessus or CrowdStrike and criticize others is too damn high.

I had to work with a security team in a FAANG for several years. They were so high and mighty with their low-sev vulnerabilities, but they never improved security, and refused to acknowledge recommendations from the engineers working on systems that needed to be rearchitected due to fundamental problems with networking, security boundaries, root of trust, etc. Unsurprisingly, their "automated scanner" failed to catch something an SRE would have spotted in 5 minutes, and the place got owned in a very public and humiliating way.

When I see things like this it brings back memories of that security culture. Frankly I think Infosec is deeply broken and gawking over a wild .gitignore is a perfect example of that.

kadoban · 3 years ago
It's a bit of an information leak, but probably not a particularly serious one. It just gives some information about what tech stack they're using, which isn't really public but also not that hard to find out, and maybe a bit about where an attacker would want to look for other sensitive stuff. Pretty minor really, on its own.

It is a bit embarrassing because most web servers (and deployment setups) shouldn't be publishing/serving dot files anyway (files with names beginning with dot). But it's not necessarily a problem as long as they have some protection to avoid the _really_ sensitive stuff leaking, it's just kind of funny.

rvnx · 3 years ago
This shows that the teams in charge of code deployment have relatively weak quality control.

In practice, it means that if the gitignore file is leaked, there is a substantial risk that they'll accidentally leak the .git folder someday.

The .git folder indirectly contains downloadable copies of the source-code of the website, which could very likely lead to credentials leak or compromised services.

Your life can depend on Tesla.com services.

Even if you're only the pedestrian.
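The usual first check for that scenario (a sketch, assuming nothing about Tesla's actual setup) is whether .git/HEAD is fetchable, since its contents are unmistakable; from there the repo, and with it the site source, can often be reconstructed:

```python
import urllib.error
import urllib.request

def git_dir_exposed(base_url: str) -> bool:
    # A served .git/HEAD starts with "ref: refs/" on a normal checkout
    # (a detached HEAD would instead hold a bare commit hash).
    try:
        with urllib.request.urlopen(base_url + "/.git/HEAD", timeout=10) as resp:
            return resp.read(64).startswith(b"ref: refs/")
    except (urllib.error.URLError, OSError):
        return False  # 403/404/connection refused: not exposed
```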

extheat · 3 years ago
What makes you think that there is some "substantial risk"? You seem to be mixing together git repos and site deployment rules. I don't see the big deal here with some CMS leftovers being deployed, but yes from a perspective of correctness this is not something that needs to be deployed.
mlindner · 3 years ago
> This shows that the teams in charge of website code deployment have relatively weak quality control.

FTFY. Little of Tesla's software is whatever they're using on the website. That'd be like judging Apple OS software by their website source.

drexlspivey · 3 years ago
So basically everyone’s life is at risk because the .gitignore got leaked. That sounds reasonable.
bpodgursky · 3 years ago
I'd be pretty surprised if the marketing / landing site was remotely connected to the user portal. Most companies have a marketing-friendly CMS for public content, disconnected from the actual customer-facing portal.
diogenesjunior · 3 years ago
what makes you think the tesla.com website is where they keep their real code lol?
bobthepanda · 3 years ago
The gitignore explicitly called out where the sensitive settings file is, so presumably that makes it a lot easier to figure out where to start injecting bad code
Alupis · 3 years ago
Sure, but this appears like some very standard directories for popular website CMS platforms like Drupal.

So, not very surprising and probably doesn't really tip anyone towards anything particularly special.

m00x · 3 years ago
It's probably caused by an incorrect nginx configuration, which means other static files may be exposed.

Otherwise, it's not much of a leak.

diogenesjunior · 3 years ago
you could theoretically social engineer until you find something to exploit

e.g., if the file said to ignore "/site/adminpasswords.txt" then you could go to /site/adminpasswords.txt and reveal admin passwords. this is obviously a simple eli5 explanation but i hope it helps

however, i doubt the tesla.com website is where they keep any important code that relates to actual tesla software like we would see used in cars. that would be like the army having their real code for their software/systems at goarmy.com lol

mlindner · 3 years ago
It's not really leaky and can't be exploited by anyone. It's an interesting curiosity at best.
