Late last night, Mr. William Chalmers from Lloyds Banking Group in the United Kingdom sent me quite an unusual email. With the subject “Mutual Investment,” I absolutely expected yet another boring investment scam email which would promptly go into the spam bin. However, upon opening this email I was pleasantly surprised to see that this scammer had not sent me an “investment opportunity” (read that as investment scam) but accidentally emailed me his source code. Oops. I won’t include the full code here, but I will include some interesting snippets.
The code itself, written in Python, is rather revealing: the scammer is attempting to use a "Gemini Cloud SDK" or "CloudShare SDK." Neither of those packages actually exists, but the Gemini reference suggests the intent might be to eventually develop the tool so that it dynamically generates emails using Gemini.
# Assuming we are using the Gemini Cloud SDK (Replace with actual Gemini or CloudShare SDK)
import gemini # Hypothetical import for Gemini SDK (replace with actual import)
import cloudshare # Hypothetical import for CloudShare (replace with actual import)
These first several lines also highlight that the scammer is likely using AI “vibe coding” to produce this code. The comments like “# Hypothetical import for Gemini SDK” are a dead giveaway. An actual developer writing a functional scam tool would use the real library (like google-generativeai or boto3). An AI provides these placeholders when it isn't sure which specific API the user wants to use. Ironically, the scammer likely made a mistake in their prompt or copy-pasting. They probably asked an AI something like, "Write a python script that creates a document in a cloud sharing app with this text and shares it with a user," and then forgot to replace the AI's placeholders.
The next section of the code sets up a “collaborative session” with Gemini or CloudShare. It claims to be using CloudShare, but attempts to call Gemini. Oops.
def start_collaborative_session():
    """
    Starts a collaborative session for document editing.
    This function assumes you're using a cloud collaboration tool like Gemini or CloudShare.
    """
    try:
        # Create a new collaborative session in the cloud
        session = gemini.create_session("investment_opportunity_session")
        print("Cloud Share editor session has started!")
        return session
    except Exception as e:
        print(f"Error starting collaborative session: {e}")
        return None
I am impressed that the code uses clean error handling, and this might be another sign that the code is AI generated. The code uses try/except blocks and tidy if/else logic, and it's structured like a "How to use an SDK" tutorial you'd find in a programming manual, or like a response from an LLM.
Next, the scammer "generates" the message. However, no message is actually AI generated at all; a static message is hard-coded instead.
def generate_investment_message():
    """
    Generates the investment message content to be shared via the collaborative editor.
    """
    message = """
    Greetings To You;

    I am Mr. William Chalmers, From Lloyds banking Group United Kingdom.
    I want to invest legitimate funds with you in your country.
    Get back to me for more details of the investment.

    Best Regards;
    Mr. William Chalmers
    Private Contact: [REMOVED]
    """
    return message
I suspect that the intention here was to have Gemini dynamically generate a message similar to the static message above. However, whatever AI coding tool the scammer is using likely didn’t fully understand the ask and simply created a static message instead.
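For what it's worth, doing what the scammer apparently intended takes only a few lines against Google's real SDK. Here is a minimal sketch, assuming the google-generativeai package, an API key in a GEMINI_API_KEY environment variable, and a prompt of my own invention; the function name is hypothetical:

```python
import os

def generate_investment_message_dynamic():
    """Sketch of what the scammer presumably intended: let Gemini write the lure.

    The SDK import is deferred so the script fails gracefully when the
    google-generativeai package isn't installed.
    """
    import google.generativeai as genai  # the real SDK, unlike the placeholder imports

    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")
    response = model.generate_content(
        "Write a short, formal email proposing a mutual investment opportunity."
    )
    return response.text
```

The gap between this and the static string above is exactly the kind of thing an AI assistant papers over when the prompt is ambiguous.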
Next, the scammer defines the function for the actual scam message:
def create_and_share_document(first_name, email, session):
    """
    Creates a document with the generated investment message and shares it in the cloud session.

    :param first_name: User's first name (used for the document prefix)
    :param email: User's email (used if first name is empty)
    :param session: The current cloud session object
    """
Now, what's really interesting is that as part of this function the scammer creates a text file containing the scam message, rather than putting the scam directly in the body of the email.
    # Use first name or email as document prefix
    document_prefix = first_name if first_name else email.split('@')[0]

    # Generate document name using the user's prefix
    document_name = f"{document_prefix}_investment_message.txt"
The document prefix is either the target’s first name, or the first part of their email address. So, if this code worked successfully, the target would receive an empty email with the subject “Mutual Investment,” and a file attached such as “ken_investment_message.txt” containing the investment scam text. This looks kind of suspicious, so the scammer probably then tried to AI vibe code a script to email the contents of the generated file instead of attaching it.
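The naming logic is easy to verify in isolation. Here is the same expression wrapped in a helper (the function name and sample addresses are my own):

```python
def document_name_for(first_name, email):
    # Mirrors the scammer's logic: prefer the first name,
    # otherwise fall back to the local part of the email address.
    document_prefix = first_name if first_name else email.split('@')[0]
    return f"{document_prefix}_investment_message.txt"

print(document_name_for("Ken", "ken.analyst@example.com"))  # Ken_investment_message.txt
print(document_name_for("", "jdoe@example.com"))            # jdoe_investment_message.txt
```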
Where did it go wrong? While I don't have the "parent" script, I suspect the scammer asked another AI agent something along the lines of "generate a python script that emails a list of targets the contents of a file," then pointed the resulting script at his own .py source code instead of the generated .txt file. The result, of course, is that the scammer sent his intended victims a copy of his source code. To most recipients this would have looked like gibberish and been disregarded. Unfortunately for the scammer, he happened to target a cybersecurity analyst.
While the scammer has try/except blocks everywhere to catch "errors" (likely courtesy of AI vibe coding), he committed the biggest error of all: sending the source code to the target. There is no try/except block for "total incompetence." To the scammer who sent this email, if you happen to stumble on this article, please hand in your hoodie and your mechanical keyboard and promptly relocate yourself to the "Phone Scam" department where you can't accidentally copy-paste your own brain to the customer. Or better yet, pursue some legitimate employment opportunities.
Let this be a cautionary tale. One small mistake using code you don’t understand can leak your secrets to the world. In this case, it just cost a scammer his dignity, but in a corporate setting, "vibe coding" without review can destroy a company.
Look for more EMA coverage on the dangers of “vibe coding” in an upcoming episode of the Cybersecurity Awesomeness Podcast.

