I was able to browse Google and YouTube source code in the very early days. It was only patched when I called up a friend and let him know. I had tried to report the flaw through the normal channels of a supportless technology company, but you can guess how well that went...
What exactly do you think you saw? Bard is not trained on any data of that nature, unless it is already publicly available.
Off the top of my head, I can think of at least five foundation models (Llama, Claude, Gemini, Falcon, Mistral) that are all trading blows, but GPT is still a cut above them and has been for a year now. Transformer LLMs are simple enough that, demonstrably, anyone with a million bucks of GPU time can make one, yet nobody can quite catch up with OpenAI. What's their special sauce?