My secret to putting my email on my website without getting spammed
I got an email from Ugohow the other day. Ugohar also sent me one. The Ugos send me a lot of emails; they write to me in Russian with links to buy prescription drugs.
Thanks, Ugos.
The Ugos are just a small subset of the spammers I regularly hear from. Most of them try to sell me web services like SEO, development, or marketing. The Ugos and their friends are abusing the contact forms and public email addresses I have on my websites. I put up with them because sometimes a real human needs to get in contact, and I want them to be able to do so in the easiest way possible. Sometimes that's the whole point of a website: to get leads by encouraging visitors to contact you.
Some days the spam feels overwhelming. While looking back through my emails for this article, I found days where I was getting over 10 spam emails, all landing in my inbox and not in the spam folder. That's from one website. I built out my personal blog recently and was faced with the decision of what to include on my contact page: a contact form, an email address, or both? My domain name is my name, literally, and I want to at least try to keep the spam down.
Contact forms are the worst
It might feel like having a contact form on your website is the best experience for everyone, but contact forms are kind of the worst. I would rather pop an email address into Gmail and be done with it than deal with a contact form full of annoying required fields and dropdowns. Sending from my own email app also gives me a record of what I said and when I said it.
On the other end, it's pretty easy to f#%k up the server side of a contact form. Not speaking from experience or anything. Configure it wrong, and the message ends up in spam or doesn't get sent at all. Whoops, forgot to include their email address. Thanks for contacting the black void; see you never. There are a million ways it can go wrong. A more foolproof way is to just let people copy and paste an email address into their Gmail.
Contact forms are also ripe targets for spammers: low-hanging fruit, something they've nearly perfected. You don't want to live through the consequences of skipping some sort of bot blocker like reCAPTCHA on your contact form.
Even though the case against contact forms is strong, some people like the things, and we aim to please on the internet. So the polite thing to do is to include both a contact form and an actual email address. It feels hard to publicly display your email address, though, especially when it's a personal address and not some generic support address. How many times have you seen joe [at] gmAil dot c0m or some similarly strange obfuscation? We have good reason to be frightened.
The internet is mostly bots
So much internet traffic comes from bots that you'd be forgiven for thinking your site is a lot more popular than it is after looking at your server logs. Some are good bots, like Google's search engine crawler; you want that one to visit your website and put your latest content into Google's search results. Others are less clear about their motives. Fighting bot traffic is mostly a lost cause, but ignoring it altogether isn't a great plan either.
The internet is built on content being crawled and information being disseminated and tangled into the web. Keeping personal information out of that process can be difficult, but not impossible. We need to understand how web scraping works in order to work around it.
Traditional web scraping works like this. A script runs and downloads the contents of a webpage in its raw code form. To see what this looks like, you can right-click on the webpage you’re on and click “view source”. That’s what a web scraper usually takes in. It can then parse out the bits that it wants.
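To make this concrete, here's a minimal sketch of a traditional email-harvesting scraper, written for Node 18+ (which ships with fetch built in). The URL is a placeholder and the regex is a rough, illustrative pattern, but the idea holds: download the raw HTML and pattern-match anything that looks like an address.

// A minimal sketch of a traditional scraper (Node 18+).
// The URL is a placeholder; the regex is a rough email pattern.
async function scrapeEmails(url) {
  const res = await fetch(url);   // downloads the raw HTML only
  const html = await res.text();  // no JavaScript gets executed
  return html.match(/[\w.+-]+@[\w-]+\.[\w.-]+/g) ?? [];
}

scrapeEmails('https://example.com/contact').then(console.log);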
Traditional scrapers don’t load or run any javascript, they only load the server-side generated code. This is one of the reasons that SEO can be so tough for javascript based websites. If your site renders in the browser as opposed to the server, there’s a good chance the content won’t be available when the crawler hits the site. This is a big part of why server-side rendering (SSR) is so popular for javascript based websites. Node-based web crawlers are increasing in popularity because they can load and render javascript but they can be slower are still in the minority.
The Trick
Finally, we arrive at my secret to putting my email address on my website without being spammed. Since most crawlers just grab the server-rendered page, we take the email out of the server-rendered HTML and add it back into the DOM with JavaScript after the page loads. Users visiting your website will still see your email, after a short delay, but when your webpage is scraped, the bots won't find it, since it wasn't in the HTML the crawler downloaded.
To implement this, you can use JavaScript's setTimeout function to set a delay, then call a function that adds the email to the DOM.
setTimeout(() => addEmailToDOM('your-public@email.com'), 1000) // 1000 ms = 1 second
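Here's a slightly fuller sketch of the same idea. The addEmailToDOM helper and the contact-email id are my assumptions about the markup, not requirements; any code that writes the address into the page after load will do. As extra hardening, you could also assemble the address from pieces so the full string never appears anywhere in the page source or the script itself.

// A sketch, assuming the page has an empty <a id="contact-email"></a> placeholder.
function addEmailToDOM(email) {
  const link = document.getElementById('contact-email');
  link.textContent = email;       // make the address visible
  link.href = 'mailto:' + email;  // and clickable
}

// Build the address from parts so the whole string never sits in the source,
// then inject it after a one-second delay.
setTimeout(() => addEmailToDOM(['your-public', 'email.com'].join('@')), 1000);

The delay helps too: even crawlers that do run JavaScript typically won't sit around waiting for timers to fire.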
This is one of many ways you could go about it, and getting it working on your website will depend on your setup. This isn't a foolproof method; almost nothing is. It's a simple trick that might just help you avoid some spam.
Let me know what you think of this method. Do you think it works?
Originally published at https://joeczubiak.com on January 27, 2021.