So basically you run an endless script to fetch https://www.tesla.com/sites/default/settings.php and hope that some day there will be a minor nginx config error which lets you download the php source instead of executing it.
This will happen some day, so invest 5 bucks per month to exploit Tesla at a certain point, so maybe you can be first in line for the Cybertruck :-)
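The "poll forever and hope" idea above can be sketched in a few lines. This is purely illustrative; the URL is the one mentioned upthread, and the heuristic (a response beginning with the PHP opening tag) is an assumption about what a misconfigured server would return:

```python
# Hypothetical sketch of the "poll and pray" approach described above:
# repeatedly fetch settings.php and alert if the response ever looks like
# raw PHP source instead of an executed page.
import time
import urllib.request

URL = "https://www.tesla.com/sites/default/settings.php"

def looks_like_php_source(body: bytes) -> bool:
    # Executed PHP never echoes its own opening tag; raw source starts with it.
    return body.lstrip().startswith(b"<?php")

def poll_forever(interval_seconds: int = 3600) -> None:
    while True:
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                if looks_like_php_source(resp.read()):
                    print("misconfiguration: PHP source served verbatim!")
                    return
        except Exception as exc:
            print(f"fetch failed: {exc}")
        time.sleep(interval_seconds)
```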
Yeah, but if a gitignore tells you where to look, and it isn't even blocked by a WAF / rule, it makes an interesting target, esp. at one of the largest companies out there.
You shouldn't even be able to execute settings.php
This comment transported me back to 2010 or thereabouts when this happened to Facebook. I remember being surprised at the simplicity of the code and making a lot of jokes about "build a facebook clone" ads on freelance websites.
Except that you'll find that error long before the cybertruck ships. Heck, you'll probably see the rebirth of NFTs and BTC over US$40000 before the cybertruck ships.
If you're going to down-vote me, down-vote me because I mentioned Elon is a human being, with human flaws and human strengths and not the resurrection of Supply-Side-Jesus.
A company's marketing website and its actual products have little in common. I would be surprised if any engineers even work on the marketing website, and blown away if it is co-located with anything sensitive.
No, it's a valid complaint - I've seen at several companies that the development team was eager to present a professional website (so that anyone in the know looking would not find embarrassing stuff that might scare off potential new hires or customers), but it ended up in the hands of the marketing department. To the degree that the infra was moved to a different domain, so the wordpress install at "www.example.com" could never even remotely do anything with cookies at "example.net" - but yes, that might have been a tad paranoid ;)
I think the person you were replying to was not playing down the thing that happened, but explaining exactly what the cartoon said. It not being important for the general public doesn't mean it's not a provable fact.
It's hardly a secret that tesla.com is Drupal -- both that gitignore and the robots.txt shout it quite loudly, to be fair. One of the larger Drupal agencies, Lullabot, includes them in their client list: https://www.lullabot.com/our-work and they are looking for a sr backend Drupal engineer https://www.tesla.com/careers/search/job/sr-software-enginee... which I would take if the company were not led by Musk.
Yeah this screams complete and utter desperation. Like, I get that hating Elon is what all the cool kids at school are doing this month but do we really need this immature garbage on the front page of HN all day?
Well, I'd personally at least find some hilarity in being a Twitter engineer fired by one of those 10x Tesla engineers while they're publishing their .gitignore files via HTTPS (which probably means that their Nginx configuration is fucked).
Yes, it's meant to be public, but you need not disclose all of what is contained inside of it. I've been on many pentests where paths provided by robots.txt, that I wouldn't have obtained any other way, led to exploitable vulnerabilities.
For some reason, a considerable number of people don't seem to think twice about adding sensitive paths to robots.
I found a bug in the Tesla Model 3 reservation system that allowed anyone to get a reservation for free. Reported it via HackerOne (or maybe it was Bugcrowd, I don't remember) and got told it was of no consequence and would be filtered out later or something. Got no bounty for hours of work.
I accidentally ordered my model 3 with a free reservation, not the one I actually paid for.
Given that people are selling reservations for thousands of dollars, I think you deserved something for reporting the issue. But I suppose being a hardcore engineer means never having to say you're sorry.
You're joking of course, but that likely won't do anything useful.
If it's tracked, then ignore has no effect. If it's not tracked, then you might as well use .git/info/exclude, which is pretty much the same thing but not tracked, or you can use a global excludes file; ~/.gitignore is common (you have to configure git to point at it via core.excludesFile, iirc).
It _could_ make sense to ignore the .gitignore if some other tool is parsing and using that file, but that pattern is...troublesome so I hope not.
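The "if it's tracked, ignore has no effect" point is easy to verify in a throwaway repo. A minimal sketch (assumes git is on PATH; file names are arbitrary):

```python
# Demonstrate that .gitignore has no effect on files already tracked by git.
import subprocess
import tempfile
from pathlib import Path

def run(args, cwd):
    return subprocess.run(args, cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

def tracked_file_still_reported() -> bool:
    with tempfile.TemporaryDirectory() as repo:
        run(["git", "init", "-q"], repo)
        run(["git", "config", "user.email", "demo@example.com"], repo)
        run(["git", "config", "user.name", "demo"], repo)
        Path(repo, "secret.txt").write_text("v1")
        run(["git", "add", "secret.txt"], repo)
        run(["git", "commit", "-qm", "track secret.txt"], repo)
        # Now try to ignore the already-tracked file, then modify it.
        Path(repo, ".gitignore").write_text("secret.txt\n")
        Path(repo, "secret.txt").write_text("v2")
        # git status still reports the change: .gitignore had no effect.
        return "secret.txt" in run(["git", "status", "--porcelain"], repo)
```

Running `tracked_file_still_reported()` returns True: you'd have to `git rm --cached secret.txt` before the ignore rule kicks in.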
No. You never checkout a site directly from git to begin with.
Not letting other people know which files are ignored by git doesn't mean they can't access them. :/
I like the simplicity and pragmatism of using drupal. I wouldn’t work with it myself but it was probably the cheapest/fastest way to get a similar site up and running
If you stick completely within the Drupal "standard path", it's a great way to get a site up and running. Once you step outside of that path, it's an absolute misery.
It's leaky because it's globally accessible and provides information that isn't otherwise readily apparent.
There is no guarantee that an exposed .gitignore (or other exposed files, like .htaccess, robots.txt, etc) will be exploitable, but they aid in the discovery process and may help adversaries uncover exploitable vulnerabilities they might have otherwise missed.
At the extreme, I've seen paths of backups of the production database listed in a publicly readable .gitignore, and that database backup was publicly accessible, too.
Most of the time, nothing sensitive is revealed, but defense in depth suggests it's better not to upload files like these to your web server unless they're used by the webserver (like .htaccess) or by crawlers (like robots.txt). If you do upload them, they ought not to be publicly readable (unless intended, like robots.txt). Even then, you'd want to make sure nothing sensitive is in any such publicly readable file: even if there's nothing sensitive in them now, there's no guarantee nothing sensitive will ever be added.
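As an illustration of the "don't serve these files" point: assuming an nginx front end (speculated about elsewhere in this thread, not confirmed), a single location block refuses every dotfile and dot-directory in one go:

```nginx
# Deny access to any path component starting with a dot:
# /.git/, /.gitignore, /.editorconfig, /.env, and so on.
location ~ /\. {
    deny all;   # clients get a 403 instead of the file
}
```

Apache users get the same effect with a `<FilesMatch "^\.">` block; the idea is identical either way.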
I'm gonna give my counter take. Information disclosure is something that the DevSecOps(tm) crowd spends a disproportionate amount of time on for little benefit. The number of security professionals who don't know how to code, but learned Nessus or CrowdStrike and criticize others is too damn high.
I had to work with a security team in a FAANG for several years. They were so high and mighty with their low sev vulnerabilities, but they never improved security, and refused to acknowledge recommendations from the engineers working on systems that needed to be rearchitected due to fundamental problems with networking, security boundaries, root of trust, etc. Unsurprisingly, their "automated scanner" failed to catch something an SRE would have spotted in 5 minutes, and the place got owned in a very public and humiliating way.
When I see things like this it brings back memories of that security culture. Frankly I think Infosec is deeply broken and gawking over a wild .gitignore is a perfect example of that.
It's a bit of an information leak, but probably not a particularly serious one. It just gives some information about what tech stack they're using, which isn't really public but also not that hard to find out, and maybe a bit about where an attacker would want to look for other sensitive stuff. Pretty minor really, on its own.
It is a bit embarrassing because most web servers (and deployment setups) shouldn't be publishing/serving dot files anyway (files with names beginning with dot). But it's not necessarily a problem as long as they have some protection to avoid the _really_ sensitive stuff leaking, it's just kind of funny.
This shows that the teams in charge of code deployment have relatively weak quality control.
In practice, it means that if the .gitignore file is exposed, there is a substantial risk that they accidentally expose the .git folder someday.
The .git folder indirectly contains downloadable copies of the website's source code, which could very plausibly lead to a credential leak or compromised services.
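The standard first probe for an exposed .git folder is fetching /.git/HEAD: if that's readable, tools can usually walk the object store and reconstruct the whole repository. A hedged sketch (the base URL is whatever site you're checking; the content heuristic covers the two forms HEAD can take):

```python
# Check whether a site's /.git/HEAD is readable, the usual tell that
# the entire repository can be downloaded and reconstructed.
import urllib.error
import urllib.request

def looks_like_git_head(body: bytes) -> bool:
    # HEAD is either a symbolic ref ("ref: refs/heads/...") or a bare
    # 40-character commit hash (detached HEAD).
    text = body.strip()
    if text.startswith(b"ref:"):
        return True
    return len(text) == 40 and all(c in b"0123456789abcdef" for c in text)

def git_dir_exposed(base_url: str) -> bool:
    try:
        with urllib.request.urlopen(base_url.rstrip("/") + "/.git/HEAD",
                                    timeout=10) as resp:
            return looks_like_git_head(resp.read())
    except (urllib.error.URLError, OSError):
        return False
```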
What makes you think that there is some "substantial risk"? You seem to be mixing together git repos and site deployment rules. I don't see the big deal here with some CMS leftovers being deployed, but yes from a perspective of correctness this is not something that needs to be deployed.
I'd be pretty surprised if the marketing / landing site was remotely connected to the user portal. Most companies have a marketing-friendly CMS for public content, disconnected from the actual customer-facing portal.
The gitignore explicitly called out where the sensitive settings file is, so presumably that makes it a lot easier to figure out where to start injecting bad code
you could theoretically social engineer until you find something to exploit
ie, if the file said to ignore "/site/adminpasswords.txt" then you could go to /site/adminpasswords.txt and reveal admin passwords. this is obviously a simple eli5 explanation but i hope it helps
however, i doubt the tesla.com website is where they keep any important code that relates to actual tesla software like we would see used in cars. that would be like the army having their real code for their software/systems at goarmy.com lol
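The recon described above is mechanical enough to script. A toy version (real ignore files also contain glob patterns and negations, which this sketch simply skips; the paths and URLs are whatever the target's .gitignore happens to name):

```python
# Extract literal paths from a .gitignore and probe whether each one
# is reachable over HTTP. Illustrative only: globs/negations are skipped.
import urllib.error
import urllib.request

def candidate_paths(gitignore_text: str) -> list[str]:
    paths = []
    for line in gitignore_text.splitlines():
        line = line.strip()
        # Skip blanks, comments, negations, and glob patterns.
        if not line or line[0] in "#!" or any(c in line for c in "*?["):
            continue
        paths.append("/" + line.lstrip("/"))
    return paths

def probe(base_url: str, path: str) -> int:
    # Returns the HTTP status for base_url + path (404 means not exposed).
    req = urllib.request.Request(base_url.rstrip("/") + path, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code
```

Anything that comes back 200 for a path the developers explicitly didn't want in version control is worth a closer look.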
I used to keep a hall of shame on my main site, because looking for "settings.php" or "global.asa" on a Zope site was just silly.
- https://www.tesla.com/.git/info/exclude
- https://www.tesla.com/.git/index
README.txt 403s too. https://www.tesla.com/README.txt
edit: just going to add files I've found here:
- https://www.tesla.com/.editorconfig
- https://www.tesla.com/profiles/README.txt
Ffs, a tech forum should be better than this
Relatively common to find sensitive or embarrassing links singled out in robots.txt
Especially in old large organizations, like universities.
I wonder if these are some of the same people that Musk brought in to refactor Twitter.
> Git ignores .gitignore with .gitignore in .gitignore
(Partly joking)
Everyone uses git for source control, of course you check out a site with git.
All you are telling people with a .gitignore is what is _not_ available.
It means exactly that people cannot access them if your site is a checkout, because they aren't there.
Your life can depend on Tesla.com services.
Even if you're on the pedestrian side.
FTFY. Little of Tesla's software is whatever they're using on the website. That'd be like judging Apple OS software by their website source.
So, not very surprising and probably doesn't really tip anyone towards anything particularly special.
Otherwise, it's not much of a leak.