The idea of “Folk AI” has been around for a while. What happens when anyone can create their own personal AI applications? Here’s an AI Chrome plugin that reveals the subtext of websites.

Emerging Technology


Oct 14, 2023


It’s not far-fetched to imagine a future where every house has an AI chef who can throw together some fit-for-purpose Family AI™️. Why buy some startup’s AI-TV-Dinner-In-A-Box when you can shop the AI produce aisle yourself and cook to taste? After all, AI will write your shopping list and show you how to combine the ingredients. End metaphor.

I’m inspired by the app Boring Report, which aims to de-sensationalize news headlines. It’s such a good idea that there’s really no point in writing or reading this article…

But it got me thinking about a late-80s film where a pair of sunglasses reveals the media’s subliminal messages. It gets pretty “over the top” (another important 80s film). Anyway, couldn’t anyone make that happen to their entire web browsing experience? The basic answer is… yes.

Roddy Piper in They Live film putting on glasses that reveal subliminal messages

Subvert Your Internet

To do this, one can make a fairly simple, locally run Chrome plugin that replaces text on a website with its subliminal message. It works like Google’s site translation, but is so much harsher (especially with GPT-3.5, for some reason). Below is a before (left) and after (right) of the IDEO About page (my employer is gracious).

Needless to say, we are playing with fire here. Different websites have various Terms & Conditions that you need to be very mindful of. I recommend testing this on your own public website, not browsing around, and definitely not visiting sites with confidential information.

In my example, my prompt invokes a highly cynical film, so the translation is highly cynical. That said, it was just for me, I get the joke, and I’m not distributing the plugin… 🤷🏻‍♂️. Enjoy your subversive Folk AI tools in the comfort of your own home. Definitely don’t publish an article about it and you should be good! #FolkAI


To build this somewhat safely, you need a Chrome plugin that can read and edit the DOM (the website content) and that communicates with a server that handles the OpenAI translation.

First, you need to figure out how to build a Chrome plugin. Ask ChatGPT for a template, since that’s what I did. Once you have the plugin coded, it’s as simple as enabling Developer mode in Chrome’s extension manager and importing the template folder that ChatGPT will help you create.
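For reference, here is a minimal Manifest V3 sketch of what that template folder might contain. The extension name and matched URLs are placeholders of mine, not the actual plugin’s manifest:

```json
{
  "manifest_version": 3,
  "name": "They Live Translator",
  "version": "1.0",
  "description": "Reveals the subtext of websites.",
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["content.js"]
    }
  ]
}
```

The `content_scripts` entry is what gets `content.js` injected into every page you visit, as described below.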

The plugin I made has a content.js JavaScript file that gets injected into every HTML page you visit. I use a very scrappy method to extract the site copy (with some cleanup), then pack it up as a string. This is hard to do because everything on a website is text, so you have to carefully figure out which text is part of the visible site experience and nothing more. AI can help you parse things, but it gets super involved, and even GPT-4 struggles to take this to the next level.

When I build my string, I use the delimiter “|||”, which works well with GPT-3/4. The string, which includes dozens of site snippets, looks something like this: “{ text : “text from the site” } ||| { text : “another text from the site” } |||…”

const elements = document.querySelectorAll('p, div, h1, h2, h3, h4, h5, h6');
let texts = [];

// Walk up the tree: if any ancestor is display:none or visibility:hidden,
// this text is not part of the visible page
function isHidden(element) {
  let parent = element;
  while (parent !== null && parent !== document) {
    const parentStyle = window.getComputedStyle(parent);
    if (parentStyle.display === 'none' || parentStyle.visibility === 'hidden') {
      return true;
    }
    parent = parent.parentNode;
  }
  return false;
}

elements.forEach(element => {
  const style = window.getComputedStyle(element);

  // Ignore likely hidden or overlay elements (fixed/absolute elements
  // taller than half the viewport are probably modals or cookie walls)
  if ((style.position === 'fixed' || style.position === 'absolute') &&
      element.offsetHeight > window.innerHeight / 2) {
    return;
  }
  if (isHidden(element)) return;

  // Skip wrappers whose children are anything other than 'a' tags,
  // so each visible text is captured exactly once
  const hasNonAnchorChildren = Array.from(element.children)
    .some(child => child.tagName.toLowerCase() !== 'a');
  if (hasNonAnchorChildren) return;

  const originalText = element.textContent.replace(/\s+/g, ' ').trim();
  if (originalText.length > 0) {
    texts.push({ text: originalText });
  }
});

// Join texts with '|||' and send to the background script
const textData = => JSON.stringify(textObj)).join('|||');

Once you have that string, send it to your node.js server and let AI transform it to your specifications.


Perhaps the main reason for a backend server is to protect your OpenAI key. I had one exposed in a previous prototype, and it was instantly grabbed by someone out there. It’s not like I didn’t know that would happen. But, oops. The server code below is complete. When the plugin sends me a URL and site copy, I create a hash from the URL and check whether it was translated before. If yes, I resend the previously generated translation from memory.

I recommend playing a lot with the transformation prompt. Personally, I’d love the Boring Report concept at full internet-experience scale. You can ask it to “de-sensationalize,” “make humorous,” etc. It turns out GPT-3.5 is well aware of the movie “They Live” and gets a bit zealous. GPT-4 actually reels it in a bit and does better at returning one-word translations, but it’s slow and pricey for now. This is the prompt found in the server code:

You simulate the Sunglasses in the movie They Live and decode media’s subliminal messages such as “OBEY”, “CONSUME”, “REPRODUCE”, “NO INDEPENDENT THOUGHT”, and “CONFORM.” The goal is not to demonize, but to support media-literacy and critical thinking. The following is a series of indexed texts from a website in a json string separated by |||:

(your ||| delimited string of text gets inserted here)

Identify the subtext and secret message of the page, then replace the text fields with the PR subtext / intent of the text field using extremely simple, brutal terms inspired by They Live using the following format:
{"overall subtext": <page brutally translated subtext>, "text fields": [
{"text": <original text>, "subtext": "<brutally brief translated subtext>"}, ...]}

import express from 'express';
import cors from 'cors';
import bodyParser from 'body-parser';
import { Configuration, OpenAIApi } from 'openai';
import crypto from 'crypto';
import Client from '@replit/database';

const apiKey = process.env.OPENAI_KEY;

// Create a stable ID for each page so translations can be cached by URL
function hashUrl(url) {
  return crypto.createHash('sha256').update(url).digest('hex');
}

const db = new Client();
const configuration = new Configuration({
  apiKey: apiKey
});
const openai = new OpenAIApi(configuration);

const app = express();
app.use(bodyParser.json({ limit: '5mb' }));
  origin: function(origin, callback) {
    // Allow requests with no origin (like mobile apps or curl requests)
    if (!origin) return callback(null, true);
    if (origin.startsWith('chrome-extension://')) {
      return callback(null, true);
    }
    callback(new Error('Not allowed by CORS'));
  }

// This is just to debug, can ignore
app.get('/', async (req, res) => {
  const keys = await db.list();
  res.json({ keys });
});'/processTexts', async (req, res) => {
  try {
    const message =;
    // Create a unique ID that relates to the site URL
    const url = hashUrl(req.body.url);
    // Test if the URL matches any key in the database
    const keys = await db.list();
    const matchedKey = keys.find(key => url.includes(key));
    if (matchedKey) {
      console.log('Found match for: ' + req.body.url);
      console.log('with key: ' + matchedKey);
      const response = await db.get(matchedKey);
      res.json({ response });
    } else {
      const response = await getSystemReply(message);
      res.json({ response });
      await db.set(url, response);
    }
  } catch (err) {
    res.status(500).send('Internal server error');
  }

// Initialize and start the server
const port = process.env.PORT || 3000;
app.listen(port, () => {
  console.log(`Server is running on http://localhost:${port}`);

// Translation function
async function getSystemReply(message) {
  const prompt = `You simulate the Sunglasses in the movie They Live and decode media's subliminal messages such as "OBEY", "CONSUME", "REPRODUCE", "NO INDEPENDENT THOUGHT", and "CONFORM." The goal is not to demonize, but to support media-literacy and critical thinking. The following is a series of indexed texts from a website in a json string separated by |||:


Identify the subtext and secret message of the page, then replace the text fields with the PR subtext / intent of the text field using extremely simple, brutal terms inspired by They Live using the following format:
{"overall subtext": <page brutally translated subtext>, "text fields": [
{"text": <original text>, "subtext": "<brutally brief translated subtext>"}, ...]}`;

  try {
    const response = await openai.createChatCompletion({
      model: "gpt-3.5-turbo",
      messages: [{
        "role": "user",
        "content": prompt
    return[0].message.content;
  } catch (error) {
    console.error('Error in OpenAI API call:', error);
    throw error;
  }
Updating the site

Once your server responds with the translation, you have to swap in the new text. In my example, the translation includes the original text AND the subliminal text. I use the original text to find matches and swap it out. This is scrappy and breaks some things, like links in sentences.
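The reply coming back is the JSON described in the prompt. A minimal sketch of unpacking it into something usable (the helper name is mine; the “overall subtext” and “text fields” keys come from the prompt’s output format):

```javascript
// Hypothetical helper: unpack the model's reply into the pieces the
// plugin needs. Key names come from the prompt's output format.
function parseReply(raw) {
  const parsed = JSON.parse(raw);
  return {
    banner: parsed['overall subtext'],   // one-sentence page summary
    replacements: parsed['text fields'], // [{ text, subtext }, ...]
  };
}
```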

replacements['text fields'].forEach(replacement => {
  // Build a forgiving regex from the original text so minor whitespace
  // or punctuation differences still match
  const originalText = new RegExp(
    replacement.text.replace(/\s+/g, ' ').trim().replace('.', '\\.?'), 'g');
  const replacementText = replacement.subtext;

  elements.forEach(element => {
    for (let i = 0; i < element.childNodes.length; i++) {
      const node = element.childNodes[i];
      if (node.nodeType !== Node.TEXT_NODE) continue;

      // Temporarily concatenate the text content of 'a' tags to the parent
      // node's text, since the extracted text included them
      let concatenatedText = node.textContent;
      const childLinks = element.getElementsByTagName('a');
      for (const link of childLinks) {
        concatenatedText += link.textContent;
      }

      originalText.lastIndex = 0;
      if (originalText.test(concatenatedText)) {
        // Swap the whole text node for the subliminal subtext
        node.textContent = replacementText;

        // Clear the 'a' tags so their text is not duplicated
        for (const link of childLinks) {
          link.textContent = "";
        }
      }
    }
  });
});
I also add an orange banner to the bottom of the site with a “pr;dr” (public relations; didn’t read) that sums the page up in a sentence. It helps me quickly see if it worked, and hints that the site has been altered. In my prototype the transformation felt too “un-disclosed”; it should have been easier to recognize where text was replaced (e.g., a highlight indicating changes). Then again, I’m not going to work on it any further, so… hey. Let’s not get sued 🤞.
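A sketch of what that banner could look like. The styling, wording, and helper name here are my own guesses; the article only describes an orange banner at the bottom of the page:

```javascript
// Hypothetical sketch of the "pr;dr" banner. Color and markup are guesses.
function prdrBannerHtml(summary) {
  return (
    '<div style="position:fixed;bottom:0;left:0;right:0;' +
    'background:#ff6a00;color:#000;padding:8px;z-index:99999;">' +
    '<strong>pr;dr:</strong> ' + summary +
    '</div>'
  );
}

// In content.js, roughly:
//   document.body.insertAdjacentHTML('beforeend', prdrBannerHtml(overallSubtext));
```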

Scene from They Live film with politician in front of the word Obey

I wouldn’t recommend distributing a plugin like this, nor posting images of a site manipulated by such a thing. Again, what’s more interesting to me here is how people might make single-use, disposable “folk” AI tools for their personal consumption, versus the ability to build platforms that transform (or distort) information at scale. That said, if someone made a Chromium-based “Boring Report” browser that stripped the paint off the internet, I would CONSUME and OBEY.