. . : : E L P i S : : . .

 
 
 
 






Finally issue 13 is out
make yourself comfortable
and enjoy reading it.
__________________________ 

Do you have interesting material? 
Do you want to share it?
Contact me immediately, 
and I will publish it. Thank you.
Please send entries to downgrade@meta.ua 
thank you all!





Greetings, dear friends!
The traditional moment has arrived for us to meet once again 
on the pages of our magazine, and this is already the fourteenth issue!

With each new issue, I find myself increasingly reminded that time online feels completely different from time in real life.
It seems like just recently we closed the previous issue, just recently the gray January snow lay outside, and then suddenly Easter arrived, tables filled with smoked meats, dyed eggs, the smell of wax, warm bread, and something so homely that no digital noise can fake it.
This year's holiday was truly vibrant, without fuss, without rushing, with that rare feeling that, if only for a few days, people once again remembered what it was like to be together in real life. And right after the holidays, as if on cue, the windy season began.
And not the romantic spring winds they love to write about in poetry, but the kind that make the house creak at night and rattle the windows, so that in the morning you step outside and see a tree lying by the road, torn out roots and all, as if someone had simply plucked it from the ground.

And in moments like these, for some reason, the old internet comes to mind especially strongly.
Because it, too, once stood firm, with roots intact, and then the winds came.
The internet was once smaller. Much smaller.
So much smaller that if you showed a modern person the internet of the late 1990s or early 2000s, they'd probably think there was nothing there at all.
A few ugly pages, some forums on a gray background, FTP servers with incomprehensible directories, and homemade websites.

By today's standards, it's nothing. But somehow, in that desert, everything was there.
It's hard to explain to those who didn't live through it. There were tens, if not hundreds, of times less information, but you could find what you needed almost immediately. A search engine was just a search engine.
You entered a query, and it actually searched.
It didn't sell you the first five pages of ads, didn't push identical SEO garbage, didn't try to guess what you "really wanted to ask," didn't pretend to care about your convenience.
It simply found. Sometimes it was crooked, sometimes strange, sometimes you had to choose your words carefully, but it felt like the web was open before you, not locked with a plastic lock and the inscription "we know what you need best."

The paradox now is that the internet has become almost infinite, and finding something on it has become more difficult than twenty years ago finding a rare file on a university server in another country. Now you have to save links, pass them on almost like photocopies of samizdat were once passed around—from hand to hand, from person to person, because search engines simply don't show these places.

Not because they don't exist.

They exist.
A huge number of them. They are alive. Somewhere, old archives are still active, somewhere, software collections are stored, somewhere, small enthusiast sites exist, somewhere, ancient forums still function.
But for the modern internet, it's as if they don't exist. You can know the exact title of a page, remember the text almost verbatim, but instead of the result you need, a search engine will show you ten identical articles full of useless advice, written by neural networks for other neural networks.

The internet closed itself off.

The same thing happened with torrents. The same thing happened with FTP warez sites. The same thing happened with forums. There used to be a special thrill in searching.
Not because it was difficult, but because searching itself was part of the adventure. You could accidentally stumble upon a server containing old magazines, rare programs, music, book scans, forgotten source codes, collections of texts that someone had been collecting for years. Sometimes it took half an hour to find the file you needed, but that half hour felt real.
You felt like you were really finding something. Now you're given the illusion of instant access to the entire world, but most of the real internet is hidden somewhere behind the curtain.
Forums are a whole different story. Today, it's common to say that forums are no longer necessary. Supposedly they're outdated.
Supposedly, you can ask a neural network anything. Supposedly, it's easier to post a question in the chat and get an instant answer. And technically, yes, you can get an answer. Sometimes even a good one. But the problem is, a forum was never just a place to ask "how to fix a printer" or "where to find a driver." A forum was a place where an environment existed. Context. Memory. Personalities.

On a forum, you saw who was responding. You saw what someone had posted six months ago. You saw how the community had changed. You saw old arguments. You saw inside jokes. You saw people who could tolerate or hate each other for years, but that's what made the space alive. It wasn't a reply machine. It was a digital neighborhood where everyone had their own voice. Now the internet has become vast, but also sterile. There are more people, but there is less communication. It's a strange feeling: half the planet is online at once, and you're alone in it.

You can open any application, and there will be thousands of faces, millions of messages, endless activity, but it all passes through you like white noise. It doesn't linger. It doesn't stay. It doesn't become part of your memory. Before, even a small IRC channel with twenty people could feel like a real place.
You'd log in in the evening and know the same people would be there. Some would argue, some would share news, some would post a new project, some would debate until late at night over some trivial matter that was actually more important than most of today's "global discussions." Sometimes these people would meet up later in town: get-togethers, real-life chat meetups. Some would turn out to be completely different from their online persona. Some, on the contrary, were an exact copy of their nickname. But back then, there was no such concrete wall between the virtual and the real. Now, if you log into any modern "thematic" chat, you often see not a community, but a sham of one.
The name could be "cybersecurity," or "reverse engineering," or "underground," and inside sit teenagers who discovered the internet the day before yesterday and are convinced that the entire online culture began with memes and toxic trolling. No projects. No knowledge sharing. No internal discipline. Just endless, useless noise.
Attempts to pick on each other, senseless bullying, and the constant attempt to appear more cynical than the next person. And it's not even a matter of age—there have always been young people. The point is that the environment itself used to force a person to grow if they wanted to stay. Now the environment adapts to the lowest bar, because everything is built around maintaining attention, not meaning. Once upon a time, if there was a chat for those very same "kulhackers," you could actually see real things being discussed.

Someone would dismantle a new exploit. Someone would argue about how flawed someone else's code was. Someone would post experimental software. Someone would bring a log from someone else's server, and a long technical debate would ensue. Sometimes people would argue so much that it seemed the channel would fall apart, and two days later they would be working on a new project together. That was the essence of it. Not romanticism in the glossy sense, but a real, raw, lived experience. The internet was a place where people didn't just consume, but participated. And here, perhaps, is the main problem with the modern internet: not algorithms, not social networks, not corporations. Laziness. A deep, sticky, almost ideological laziness.
People want to recapture the atmosphere of the old internet, but don't want to do anything that creates that atmosphere. They miss forums, but aren't ready to write long messages. They miss websites, but aren't ready to create their own pages. They miss IRC, but aren't ready to sit and chat if the interface isn't sleek and doesn't send notifications every three minutes. Many want the old internet as a ready-made service—preferably in two clicks. But the old internet was never a service.
It was handmade, not factory-made. It was built by people who weren't too lazy to do something for others, even if it was only seen by twenty people. Information was valuable; it mattered.
For too long over the past decade and a half, people have been taught that the internet is a few platforms, a few apps, and an endless feed where you have to scroll down. You don't have to search for it yourself. You don't have to think for yourself. You don't have to create it yourself. Everything is already prepared.

Just consume it. And if you try to search deeper, the network literally starts to kick you out: “there’s nothing,” “no results,” “you might be interested in this.” Although, in fact, everything is there. And often in huge quantities. It’s just that the real network has long been hidden under layers of digital concrete. Sometimes it seems that the old internet disappeared not because it was destroyed. But because too many voluntarily left it for where it’s simpler. Where you don’t need to build anything. Where you don’t need to think. Where you don’t need to remember.

Where you don’t need to be a part of anything. And that’s precisely why today, in the era of endless connectivity, many feel lonelier online than back when the connection beeped like a modem and every minute cost money.

Precisely because the internet was so expensive, people thought before writing or sending something, and information had value.
And perhaps the strangest feeling is to realize that technically the internet has become more advanced in everything except the one thing it was once needed for most. It became faster, brighter, more convenient, more accessible. But at the same time, it ceased to be a place.

And yet, that's precisely what once made it special. Not the speed. Not the technology. Not the space. But the feeling that there really is someone on the other side of the screen. This is the fourteenth issue of the magazine. Even though the beginning was a bit sad, I hope you have a good time reading. Welcome.


Table of Contents


1. Planning, or how to avoid getting caught in an endless cycle of revisions
2. What about the long term current
3. HTML Map: Antiques from the Digital Baroque Era
4. Digital Ghosts and Software Sadism: The Great History of Easter Eggs
5. Terminal Web Browsers: A Brief Introduction
6. Why does progress breed human stupidity?
7. Epilogue



Planning, or how to avoid getting caught in an endless cycle of revisions


In today's fast-paced world, it's important to be able not only to start but also to finish. Many people face the problem of endlessly improving and refining their projects, whether it's creating a website, writing an article, or any other creative process. This is especially true for young people who are just beginning to master new skills and are faced with a multitude of ideas and inspirations. These ideas swarm in the head, and they keep multiplying. On the one hand, this is wonderful, because there is always room for improvement; on the other, the project stays unfinished, and may remain so forever. To avoid such pitfalls and learn to finish what you start, let's look at a few simple rules.

A technical specification (TS) and a work plan on paper are key to success. After all, at the start it's not always clear exactly what the final product will look like, but it's important to have a clear idea of the direction in your head. And although many people today prefer digital notes, a paper plan isn't just an old-fashioned method, but a real tool for preventing "endless rework."
Remember, in the last issue I wrote about doodling? In this case, a work plan (development plan) can be perceived in the same way: you draw arrows over the text, color in important areas, or highlight with a marker. All these things help your brain gather all the information into a coherent whole. While it may seem like a scribble, in your head this information will fit together into the necessary puzzle.

We are analog and can't think like a computer, and when we create a plan on a computer, we'll spend more time thinking about what goes where and which tool to use to arrange a particular idea, rather than actually writing that idea down exactly as we want it to be. Simply put, we won't be able to think about the plan; we'll be thinking about how to "format" it into a document.

The paper format helps us build a picture without overloading ourselves with unnecessary technical issues. We don't think about layout or worry about how to fit every thought onto the page, but simply write as our brain tells us. This gives us freedom. No lines, margins, fonts, or frames distract us from the idea itself, from the essence of the project.

When we start using digital tools, we automatically switch to planning mode in the traditional sense. Instead of thinking about content, we start thinking about how to arrange information on the screen, where to leave margins, and what font to choose. This process outweighs the task itself—we don't focus on what's important, but instead begin anxiously sorting through options for how to "beautifully" present our plan. All this leads to us spending time not on the essence of the work, but on its external aspects.

Why is this so important? Firstly, on paper you can truly feel the process and make a few sketches that you won't be embarrassed to ruin. This is a kind of first step towards completion. Once the plan is "frozen," you can begin implementing it. By contrast, when a project isn't limited by the confines of a sheet of paper, it can be endlessly expanded and "improved," and never finished.

How this works for the brain:

An important aspect here is how our brain works when perceiving information. When we draw or write by hand, the parts of the brain responsible for creative information processing are activated. We're not limited by boundaries and rules, as we are when working with digital documents. Sometimes scribbles and random arrows help structure our thoughts and give the brain a chance to organize information.

The brain perceives handwritten planning as something organic, something that can be easily processed and changed. We don't waste time on layout and design—ideas take center stage, and everything else is just a way to visualize them. Arrows, circles, and highlighted phrases—all this helps us not only capture information but also understand how to connect it. It's like creating a true information puzzle. Each piece and fragment in our minds becomes important, not as an element of a "document," but as part of a unified whole.

The advantage of analog methods.

Why is an analog approach so important? It allows us to move away from formalism and focus on the thinking process itself. Unlike computer programs, which require time to master the tools, paper allows you to literally pour out all your ideas at once and without restrictions. This is precisely what allows the brain to work in a freer, more creative environment.

When a plan is developed by hand, it becomes more than just a jumble of words on a page. It's more of a living process that evolves as new ideas arise. You don't try to arrange everything or format it correctly right away. You simply write, draw, and underline. And over time, you begin to see the whole picture. A clear image of what the final work should look like emerges in your mind.

The problem of overstructuring

When working on a computer, everything becomes more mechanical. Using a word processor or other digital tools, we automatically fall into structuring mode, where thoughts must be organized according to templates and styles selected in the program. This distracts us from the planning process itself and complicates the task. Instead of freely developing our thoughts, we begin to formalize them, which hinders our thinking. We become distracted by "technical details" that, ultimately, have nothing to do with the essence of the task.

Digital tools are great when the work is already in progress, when the content is ready and only needs to be organized according to specific requirements. But at the start, especially when the plan is still in its infancy, manual planning gives us much more freedom for thoughts, ideas, and decisions.

A perfectly blank slate isn't freedom, but an overload of possibilities.

When you have a completely empty space in front of you (like a new document in an editor), you have no support: no scale, no direction, not even a hint of where "up" and "down" are in semantic terms. And this leads to paralysis. A perfectly blank sheet of paper is a trap. It demands everything from you at once: structure, idea, form, meaning. You look at it and realize you could write anything—and that's precisely why you write nothing. A blank document in a computer editor is even worse. It's not just blank, it's also technologically pristine: even margins, perfect font, a blinking cursor that seems to be counting down the seconds until your disgrace. This cursor behaves suspiciously—blinking as if to say, "Well? I'm waiting. Go ahead. Surprise me." And you suddenly begin to feel that every first word is a mistake. Because from such a start, you can't write "well, anyway" or "in general." What's required is "In an era of rapid development..." at the very least.

Now take a scrap of paper. Small, crooked, with some old phone number or a drawing on the back. And suddenly everything becomes simpler. You don't think about a grand plan—you think, "A couple of lines will fit here, and that's it." There's already imperfection there, so you have the right to be imperfect too. There's already some nonsense there, so you're not the first, and that takes away half the pathos. A scrap of paper doesn't expect a masterpiece from you.

The result is a strange paradox: to start writing properly, you don't need a perfect space, but a slightly flawed one. Because a perfect blank sheet requires a perfect you.

People often forget that execution isn't always perfection. The main focus at the beginning of work should be on content, and only then can you think about design and beauty. It works, and it's time-tested. Only those who understand the importance of content will be able to cultivate something great and lasting, and not just create a "picture." Content first, form second.

Everyone knows the feeling of explaining things to people, giving them clear recommendations, and yet they still leave confused. Sometimes it's so exhausting! People, especially teenagers, often come with zero knowledge, and it seems like you're teaching them, but they still do it their own way.

But this is where patience and the understanding that not everyone is immediately ready for this path are crucial. It often happens that after several attempts, after lengthy explanations, you finally get the gist of it across. And maybe not everyone will stick around, but if even five out of a thousand people understand your idea and move on, that's already a great result.

When creating content, you're faced with a situation where you don't just need to fill a page, but make the information useful and relevant to those who are truly interested. It doesn't matter how many people see your work; what matters is how many of them stay.

For some reason, many people start not with the work itself, but with an imaginary tribunal. Nothing's written, posted, or finished yet—and the jury's already racing through your head: "Nobody needs this," "Nobody will understand this," "This has already been done better." And the judges are incredibly strict and, crucially, completely fictitious. They haven't read your text, haven't seen your code, but they've already delivered their verdict. The result is a funny situation: you've forbidden yourself a result that doesn't even exist yet.

It's even funnier when predicting other people's thoughts kicks in. Not only hasn't a person started—they already know what "people will think." Which people? When? Where? It's not specified. But the certainty is rock-solid: "It won't work." Ultimately, half the ideas die at the "Well, it's probably stupid" stage, and the other half at the "I've almost finished it, but what if someone doesn't like it" stage. It's as if the only acceptable outcome is instantaneous delight for everyone around you; otherwise, it's not even worth trying.

The problem is that this approach guarantees only one result—no result. You can't tell whether it will work or not until others see it. You can't get feedback if you've already forbidden it. And you certainly can't learn to do better if all the best stuff is left in drafts "just in case." Mistakes, strange reactions, silence—these aren't signs of failure; they're simply part of the process you're trying to navigate without moving forward.

A whole other genre is the "here's a link, you can figure it out yourself" genre. Someone posts a link and expects everyone to rush to it, as if they've just opened a portal to another world. In practice, this doesn't work very well. A link without context is just an address. It has no reason to be opened. No one is obligated to guess what's inside or why it should be interesting at all.

People respond to interest, not links. If you haven't explained why it's worth checking out, why should anyone else? A few words, a short description, some kind of hook—and there's a chance someone will stop and think, "Okay, I'll take a look." Without that, it feels like, "Here you go, I made something, but I won't tell you, maybe you'll figure it out yourself."

Ultimately, it all comes down to a pretty simple thing: don't try to predict other people's reactions beforehand. They'll handle that perfectly well once they see the result. Your job is to get the result to the point where it's "showable." Then you can discuss, improve, and rework it—but at least there will be something to discuss, not just another perfect idea that no one has ever seen.

Old ideas for a new time.

Today, more and more people are beginning to return to the ideas of the past. Web 1.0 with its simple and straightforward interfaces, where minimalism wasn't just a trend, but a genuine necessity. There's no point in collecting old CDs or copying other people's ideas when you can create something new, but with respect to old principles.

Pure minimalism isn't just a trend, but a way to create things that will work even if technology and design change. Web 1.0 wasn't about complex interfaces and tons of dynamic effects, but about simplicity and functionality. This spirit can and should be revived, but at a new level, taking into account modern requirements and capabilities.

It's important to take steps.

Content creation, web development, design, or training don't yield instant results. It's important to have a plan and stick to it. Content and design aren't just elements you can throw together and forget about, but carefully crafted parts of a project. If a project lacks purpose and isn't connected to the right people, nothing will come of it.

But if you can ensure your content is useful to those who understand its value, then that's already a success. Even if only 5 out of 1,000 people remain, they'll be the ones who continue to develop, code, and design. And you'll be proud that you were right.

Don't be afraid to share your knowledge, even if your experience seems small or insignificant. After all, it's this experience that makes us those who help others move forward.



What about the long term current


I think about death perhaps more than most twenty-somethings. My own, yes (but that’s beside the point and beyond this essay’s scope) but also that of my websites. Why? I don’t know.

Yet every time a domain renewal notification arrives in my inbox, it marks something. Ensuring they shall be mine to experiment with for another year...there’s something tender about it. Almost ritualistic, if you’ll allow me the metaphor. I take them wherever and whenever I can get them.

Many things went through my head when I started my first site two years ago, but never this: when you create one, you’re making a promise. This’ll be here tomorrow. And the day after that.

But will it? I’m not sure anyone can answer that honestly.

The internet has a memory problem. Not in its infrastructure or costs, but in its custodianship. Someone has to care enough to keep paying the registrar, renewing the hosting, patching the underlying software lest your poetry blog turn into a horrendous scam.

Do most of us think about this? I’m not sure. We assume sites will persist, like books do. But books don’t need electricity. They don’t vanish when a company does.

GeoCities disappeared, and millions of pages went with it, only because the business in charge decided it wasn’t profitable. And it’s not an exception. It’s another headstone in the graveyard.

My teenage self and countless others poured themselves into services that went defunct. The family man who loses all his photos when he transfers accounts. The forums...all those connections and discoveries and teachings...they aren’t accessible now.

Every site on the World Wide Web is a relationship between someone and a computer. When the person leaves, the machine forgets.

So what do we do? I wrestle with two apparently conflicting impulses here.

The first is grief. It’s tempting to accept impermanence as the price of admission. To mourn and move on.

The second is indignation. I refuse to accept it’s the only option.

The IndieWeb community has been thinking about this for years. Their core argument: own your site, own your data. Share what you make elsewhere if you want, but keep the original version on ground you control. They call this POSSE: Publish on your Own Site, Syndicate Elsewhere. It's not a catchy acronym for its own sake. It's a survival strategy.

But even this has limits.

I rent my domains. I rent my hosting. I back up my files. What happens when I can’t? If I lose interest, or funds, or the entire plot? We write wills for our assets. Who inherits your domain name?

The Wayback Machine does extraordinary work. The Internet Archive is, perhaps, the single most important cultural institution of the twenty-first century. I’m not exaggerating, I swear. But it’s also one organization and one set of servers. Shouldn’t that terrify us?

And this is where it gets uncomfortable.

Not everything ‘deserves’ to be preserved forever. I know that. We all do. As much as I adore the internet, it produces more in a day than entire centuries did. Most of it’s noise. Some of it’s worse.

Things like the Great Gatsby and the Arthurian legends endure because every generation since has decided they’re worth the effort. Preservation as verdict, if you will.

So when I mourn the loss of a site...what am I mourning, exactly? The creations within? Or the act of making it? There’s a difference, and perhaps I’ve been conflating them.

Someone’s fansite from 2001 isn’t Shakespeare, sure. Perhaps it doesn't need to survive intact for five hundred years. But the impulse behind it, the fact that someone was keen, knowledgeable, and generous enough to share it with strangers: that is the thing worth keeping.

The question is whether you can separate the two.

I’m not sure you can. And perhaps that’s why this problem has no elegant solution: because we're not preserving _pages_. We're preserving evidence that someone cared.

While I can’t decide what’s worth preserving, I can tell you what you can do to maintain your creations. Because if you put in the time and effort, they’re worth it to you, so they’re worth it to me.

Build small, and in formats a browser from 2045 could still render. There’s a reason the small web fascinates me. An HTML page is, structurally, one of the most durable digital objects one can create.
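As a sketch of what that durability can look like in practice (the page, title, and address here are of course invented), this is a complete, self-contained file: no scripts, no external stylesheets or fonts, nothing that rots when a CDN or company dies.

```html
<!-- A hypothetical "durable" page: everything a browser, now or in
     2045, needs to render it lives in this one file. -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>My Cat Page</title>
</head>
<body>
  <h1>My Cat Page</h1>
  <p>Photos and stories about my cat. Updated by hand, whenever.</p>
  <p><a href="mailto:me@example.net">Write to me</a></p>
</body>
</html>
```

Plain HTML like this survives being copied to a USB stick, emailed, printed, or mirrored, with no build step and no dependencies to break.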

Keep things in more than one place. Print them if you can bear it. Give copies to friends. Email your posts to someone who cares. Let your works exist in formats that don’t need a power outlet.

And please, talk to the folks you trust about your sites. Tell them where the keys are.

Because, despite everything we might attempt to prevent it, impermanence is woven into everything. Even the internet. Nothing is guaranteed.

And yet…when you think about it: the creations outlast the container. Always.

I think about the websites I visited as a kid. I can’t prove they existed. But those sites taught me strangers could build homes out of code and leave the door open for anyone who wandered in. The ideas I encountered on those pages live in how I think, what I make, what I care about. They shaped me in ways no archive could capture, because their influence moved from pixels into a person.

That’s what art does when it’s honest. It escapes the medium. A recipe shared on a food blog becomes a meal someone cooks for their family for decades. An essay about loneliness reaches a teenager at the moment they need it, and they carry the feeling forward into every conversation they’ll ever have. A tutorial teaches a skill that gets taught again and again.

The words, the ideas, the pictures...these things migrate. It’s the web as oral tradition, handed not through stories around a fire but through pixels on a screen encountered at the right moment.

Perhaps that’s the long term we should think about. Not preserving every page forever, but creating things worth carrying forward. The urge to make something and share it.

As long as someone, somewhere, is hand-coding a page about their cat, or their favorite movies, or the way a café worker writes their name on their morning beverage...the thing that matters is alive.

Not the page itself, but the urge. The sites will come and go. The impulse endures.


Zachary Kai




HTML Map: Antiques from the Digital Baroque Era


If you started working on the front end after 2010, you've likely only seen the <map> tag in reference books, alongside <marquee> and <blink>. But there was a time when <map> was considered the pinnacle of technological advancement.

Historical Background: The Birth of a Legend

The <map> tag was officially incorporated into the HTML 3.2 specification in 1997. Why? To solve the main problem of the time: "How can a single image be multifunctional?"

In the 90s, the internet was slow. Every single image button was an extra HTTP request, making your modem emit the sounds of a dying cyborg for an extra 10 seconds. The solution? "Image Maps." You download a single, heavy file (like an office map or a starship dashboard) and, with a flick of the wrist, transform it into a scattering of links.

Browsers of the time—Netscape Navigator and Internet Explorer 3.0—treated this as magic. It was the first attempt to make the web truly interactive without the heavy lifting of Flash or Java applets.
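The pattern itself is simple, and still works in every browser today. A minimal sketch (the image file and link targets below are invented for illustration): one <img> carries a usemap attribute pointing at a named <map>, whose <area> elements define the clickable regions.

```html
<!-- One image, several clickable regions: the 90s way to get a whole
     graphical menu out of a single HTTP request.
     dashboard.gif and the target pages are hypothetical. -->
<img src="dashboard.gif" alt="Starship dashboard"
     width="400" height="200" usemap="#dash">

<map name="dash">
  <!-- For shape="rect", coords are left,top,right,bottom in pixels -->
  <area shape="rect"   coords="10,10,120,90"  href="engines.html" alt="Engines">
  <area shape="rect"   coords="140,10,260,90" href="shields.html" alt="Shields">
  <!-- For shape="circle", coords are center-x,center-y,radius -->
  <area shape="circle" coords="330,60,40"     href="warp.html"    alt="Warp core">
</map>
```

The "#dash" in usemap must match the name of the <map>; each alt text is what text-mode browsers and screen readers fall back on instead of the picture.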

The Age of Pathos: How Brands Had Fun with <map>

In the late 90s and early 2000s, having a simple text menu on a website was considered a sign of poverty and lack of imagination. If you were a major brand, you were obliged to force users to "explore" your site.

1. Apple and their products: Long before minimalism, Apple's website featured detailed images of computers. Clicking on the monitor took you to the screen specifications section. Clicking on the mouse took you to the accessories section. It looked incredibly expensive.

2. Automotive giants (Ferrari, Mercedes): They loved this tag. You saw the car's interior in front of you. Want to know about the steering wheel? Click on it in the photo. Leather seats? Click on them. This created the illusion of presence, although in reality it was just a picture with invisible "holes"—links.

3. Gaming Websites: Websites for games like Diablo and Fallout used <map> to transform the homepage into a game interface. The site's inventory menu was a clickable image map.

Back then, visiting the website of your favorite game or TV show was like starting a quest. Instead of a boring white background with titles, you were greeted by a Splash Page: a huge, bombastic image that took up the entire screen and literally screamed, "You're not welcome here until you find the entrance!" This was the golden age of the <map> tag, when designers at Blizzard and Westwood Studios, and the builders of Star Wars fan sites, competed to see who could hide navigation best.

Imagine a typical Diablo or Warcraft website: instead of a list of "News" or "Forum" sections, you'd see a meticulously rendered altar in a Gothic cathedral or an orcish hut. To get to the "Characters" section, you'd click on a rusty sword leaning against the wall, and the "Guestbook" was hidden behind a dusty tome on a shelf. An image <map> linked these pixelated objects to real-world links. If you missed the skull that led to the "Bestiary" by a few millimeters, you'd simply poke into nothingness, cursing the designer but feeling incredibly immersed.

Fansites for TV series, such as The X-Files, took this to extremes. The homepage could look like Agent Mulder's desktop: scattered photos (links to the gallery), an FBI badge (about the authors), and a smoldering cigarette in an ashtray (the conspiracy theory section). All of this relied on a single <map> tag with a dozen <area shape="poly"> areas. It was pure visual flourish: the site didn't just provide information; it forced you to "live" through its interface, turning ordinary browsing into exploration.

It was a time when design took precedence over user experience (UX). Users had to move the mouse around the screen like a mine detector, hoping the cursor would change to a "finger" to indicate an active area.

Theory: How this magic was built

Technically, <map> is an invisible layer of coordinates on top of the image.

<img src="cool-image.jpg" usemap="#power-map" alt="Navigation">

<map name="power-map">
  <area shape="rect" coords="34,44,270,350" href="catalog.html" alt="Section 1">
  <area shape="circle" coords="450,120,60" href="contacts.html" alt="Button">
  <area shape="poly" coords="10,10,50,20,40,80,5,70" href="secret.html" alt="Complex figure">
</map>

The layout designers of the time were masters of geometry. Imagine opening Photoshop, selecting the "Info" tool, and manually entering the coordinates of every pixel for a complex polygon. A single digit error, and your link would drift off. It was meditative, but hellish work.

The Fall of an Empire: why did we abandon it?

1. Responsive Design: This is the tag's biggest killer. Coordinates in <map> are specified in fixed pixels. As soon as the image is compressed on a phone screen, the coordinates remain the same. As a result, the "button" that was on the steering wheel of a car ends up somewhere near the windshield.
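For the curious, the usual workaround today is a few lines of script that rescale every area's coordinates whenever the image is displayed at a size other than its natural one. A minimal sketch (the id, file names, and links here are invented for illustration):

```html
<img src="dashboard.jpg" usemap="#nav" id="nav-img" style="max-width:100%">
<map name="nav">
  <area shape="rect" coords="34,44,270,350" href="catalog.html" alt="Section 1">
</map>
<script>
  // Stash the original coords once, then rescale them by the ratio of
  // the displayed width to the image's natural width on load and resize.
  const img = document.getElementById('nav-img');
  function rescale() {
    const ratio = img.clientWidth / img.naturalWidth;
    document.querySelectorAll('map[name="nav"] area').forEach(area => {
      if (!area.dataset.orig) area.dataset.orig = area.coords;
      area.coords = area.dataset.orig.split(',')
        .map(n => Math.round(n * ratio)).join(',');
    });
  }
  img.addEventListener('load', rescale);
  window.addEventListener('resize', rescale);
</script>
```

This is exactly the kind of bookkeeping that responsive design made mandatory, which is why most people simply gave up on the tag.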

2. SEO and accessibility: Search engines had trouble understanding the structure of such links, and screen readers for the visually impaired often went crazy trying to explain what this "polygon" was.

3. Retina displays: Pixel density made things even more complicated.

Modern use: a sarcastic "Is it still alive?"

Surprisingly, <map> is still supported by all browsers. Why? For backward compatibility. The world would collapse if the Ohio State Library website, built in 1998, stopped working.

Today, <map> is sometimes used in:

Complex infographics where you need to quickly mark up areas, and you're too lazy to mess around with SVG (although that's a sin).

Emails (Email newsletters). Email clients are a tech ghetto stuck in 2004. There, <map> sometimes works better than complex layout.

In the mid-90s, the browser wars were in full swing, and support for the <map> tag became one of the battlegrounds on which the future of the web was decided. Before the advent of client-side image maps (which we now know as <map>), only server-side maps existed. It was pure hell: you clicked on an image, the browser sent the click coordinates to the server, the server frantically tried to figure out where you were in its database, and only then redirected you. It was slow, clunky, and hammered the network.

Spyglass Mosaic pioneered the client-side approach, and then Netscape Navigator 2.0 picked up the torch. It was Netscape that made <map> mainstream by letting the browser itself understand where the user clicked, instantly showing the link in the status bar. It was a technological marvel: the page came to life right under the cursor, without waiting for a response from the server. Microsoft's Internet Explorer, as was often the case in those years, jumped on the bandwagon a little later, fully implementing support in version 3.0. Until then, developers had to hedge their bets, writing dual logic: one path for advanced Netscape users and another for the laggards on older versions.

Funnily enough, older browsers that didn't understand <map> simply ignored the tag. To them, it was just invisible junk in the code, and the image remained just an ordinary image, unclickable. To avoid leaving users stumped, the designers of the time created "text footers"—a list of links beneath the image, which they called "the poor man's menu" or "the menu for text-only browsers" (like Lynx). This was considered good form: even if your visitor had an old browser or disabled images to save bandwidth, they still had to somehow navigate through your pretentious Star Wars fan site.
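Such a fallback might have looked something like this (file names and links invented for the example): the image map for graphical browsers, and a plain row of links underneath for everyone else:

```html
<img src="navigation.gif" usemap="#nav" alt="Site navigation">
<map name="nav">
  <area shape="rect" coords="0,0,100,40" href="news.html" alt="News">
  <area shape="rect" coords="100,0,200,40" href="forum.html" alt="Forum">
</map>
<!-- "The poor man's menu": visible even in Lynx or with images disabled -->
<p>[ <a href="news.html">News</a> |
     <a href="forum.html">Forum</a> |
     <a href="guestbook.html">Guestbook</a> ]</p>
```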

Server-Side Maps (ISMAP): When Your Click Flew Across the Ocean for Advice

Before the <map> tag brought a semblance of intelligence to our browsers, Server-Side Image Maps ruled the world. They were the pinnacle of hacky programming. Imagine: you want to click the "Login" button on an image. In today's world, the browser automatically figures out that you've hit a square. In the ISMAP era, the browser was as dumb as a brick. It simply told the server, "Hey, boss, some meatbag is poking at coordinates X=105, Y=203. What should I do?"

The syntax for this mess looked something like this:

<a href="/cgi-bin/main-map.map">
  <img src="huge-navigation-panel.gif" ismap alt="Good luck guessing where to press">
</a>

Notice that humble ismap attribute. It turned the image into a spy. As soon as you clicked it, the browser generated a request like main-map.map?105,203.

But the funniest part was happening on the server. There was a special text file (usually with the .map extension) that looked like a cheat sheet for the chronically forgetful:

default /home.html
rect /news.html 0,0 50,50
circle /contacts.html 100,100 30
poly /secret.html 10,10 20,40 50,10

So what was so ironic about this setup?

1. Speed: You click, wait for the request to reach the server in California, the server spins up its CGI script, reads the text file, compares the numbers, and sends back the command "Go to news.html." In the era of 14.4 kbps modems, this process took forever. You could make a cup of tea while the server decided whether you hit the right button or missed it by a pixel.

2. No feedback: The cursor didn't change to a pointing hand. You moved the mouse blindly across the image. It was a lottery: "I wonder if this pixelated blob is a link or just dirt on the monitor?"

3. Load: Every sneeze from the user forced the server to work. A thousand people simultaneously mousing around a map? Congratulations, your server just crashed trying to calculate the area of a circle.

Software for "cartographers": Tools from the Chamber of Weights and Measures

Only saints, or people with eagle eyes and plenty of free time, could create such maps by hand, writing down the coordinates of every pixel in a notebook. For everyone else, there was specialized software.

The king of the party was MapEdit. It was a tiny utility that allowed you to open an image and literally "draw" the desired locations with your mouse. The program itself generated this endless list of numbers for <area coords="...">. If you had MapEdit, you were considered elite—you weren't just "writing code," you were "designing the interface." It was like working in Paint, but the output was magical HTML.

Later, the heavyweights arrived. Adobe ImageReady (Photoshop's little brother, bundled with it until CS2) made creating maps an industrial process. It had a "Slices" tool, but for those who really wanted a map, there were "Image Map Tools." You'd draw circles over Arnold Schwarzenegger's face on a Terminator fan site, enter the link into the properties window, and voila—ImageReady would spit out a ready-made piece of HTML code, which all you had to do was paste into your index.html.

There was also Macromedia Dreamweaver, a dream for any visual developer. It let you simply lay out zones right in the preview window. It looked so cool and technologically advanced that the layout designers of the time felt like operators from The Matrix, although in reality they were simply placing invisible links over a 256-color GIF image.

Epilogue

The <map> tag is like an old vinyl record player in the world of streaming. It's awkward, cumbersome, and requires manual configuration, but it has its own charm from the "golden age" of the web. It reminds us of a time when we weren't afraid to experiment and made users click on every pixel in search of Easter eggs.

Today, SVG (Scalable Vector Graphics) has taken its place. It does the same thing, but it scales, can be styled with CSS, and doesn't break on iPhones. But remember: every time you draw a <path> in SVG, somewhere in the world a little <area> tag sighs, longing to be part of Ferrari's pompous website again.
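To make the comparison concrete, here is roughly what the rect/circle example from earlier would look like redone in inline SVG. Unlike the fixed pixel coordinates of <map>, everything inside the viewBox scales with the element, so it survives phones and Retina screens untouched (the links here are placeholders, as in the original example):

```html
<svg viewBox="0 0 500 400" style="width:100%; height:auto">
  <!-- Each link wraps a shape; coordinates live in viewBox units,
       so the whole thing scales cleanly on any screen. -->
  <a href="catalog.html">
    <rect x="34" y="44" width="236" height="306" fill="transparent">
      <title>Section 1</title>
    </rect>
  </a>
  <a href="contacts.html">
    <circle cx="450" cy="120" r="60" fill="transparent">
      <title>Button</title>
    </circle>
  </a>
</svg>
```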




Digital Ghosts and Software Sadism:
The Great History of Easter Eggs


An Easter Egg is a piece of code that serves no useful purpose other than satisfying the programmer's ego and entertaining a bored user. It's like finding a stash in an old jacket, only instead of money, it's a dancing dinosaur or a message telling you that you wasted your life scrolling through your feed.

Why is this funny? Because the IT world is a sterile, cold desert of ones and zeros. Easter eggs are graffiti on the walls of this desert. They're a way of saying, "Hey, there was a real person here, and it was important to them to mark the code like a dog with a post."

Background: Revenge as an Engine of Progress

The first Easter egg was born not from a good life, but from corporate slavery. In 1979, Atari banned programmers from listing their names in game credits (to prevent competitors from poaching them). Warren Robinett, creator of Adventure, decided this was unfair. He quietly built a secret room into the game, complete with the inscription "Created by Warren Robinett."

Atari managers found out about this after the cartridges had already gone to print. They were furious, but one of the bosses wisely remarked, "It's like an Easter egg hunt! Let's just say we planned it that way." Thus the name was born, and thus corporate greed was shamed by a digital guerrilla.

Steve Jobs and His "Holy" Secrets.

Steve Jobs was a perfectionist with tyrannical tendencies, but he also liked to wink at those in the know.

A Toast to the Madmen: If you went into the settings in older versions of macOS and selected certain fonts, you could find the full text of the famous "Here's to the crazy ones..." speech.
Windows Icon: Apple has always loved to poke fun at its competitors. For a long time, the Windows computer icon on the Apple network was an old, pot-bellied monitor with a "blue screen of death." Subtle, cruel, classic.
Bill Gates' Book: In some versions of the Apple interface, you could make out the text of a letter when viewing the "TextEdit" icon. This was an actual letter from Jobs to employees, but the humor was that the "Documents" folder icon sometimes featured a silhouette that looked suspiciously like Bill Gates.
Wozniak in Code: The very first Apple I and II computers had the creators' signatures baked into the ROM. Steve Wozniak loved hiding mathematical jokes there that only three and a half people in Palo Alto could understand.

Digital Sabbath, or websites that are closed on weekends and holidays.

This is the ultimate form of Easter trolling: sites that observe holidays right alongside people. It's not a bug, it's a design decision that hits the spoiled consumer in the face.

B&H Photo Video: The most famous example. This is the largest photography store in New York City, owned by Orthodox Jews. Every Saturday (Shabbat) and during Jewish holidays, their website literally turns into a pumpkin. You can log in and browse the products, but the "Buy" button disappears. You're politely told, "Come back tomorrow; today we're talking to God, not your credit cards."
"Government-Serving" Websites: In the early 2010s, there were small government portals in Europe that literally "closed" at night. At 8:00 PM, a message appeared: "We've gone home. The site will reopen at 8:00 AM. Go to bed." This is the best dark humor in the history of the web: a reminder that even a soulless server has the right to sleep, unlike you, freelancer ;-)

Google, the evil and fun corporation, is Disneyland for those who love to dig through search. They have Easter eggs for every occasion, including the end of the world.

Thanos Snap: In honor of The Avengers, Google introduced an Easter egg with Thanos's gauntlet. When you clicked it, half the search results simply crumbled to dust. Ironic, considering that Google can literally erase your business from search results with one click.
Zerg Rush: Search turned into a game where lowercase "o" letters ate up the results. If you didn't shoot them in time, that's it, your query was destroyed. It's a metaphor for how quickly information on the internet turns into garbage.
The Last of Us: Type the name of the series into the search engine, and a mushroom will appear at the bottom. Click on it, and the screen will start to grow cordyceps. A wonderful reminder that we are all just a breeding ground for fungi and algorithms.

Dark Humor and "Grim" Easter Eggs.
Some developers have a sense of humor on the level of "a medical examiner on vacation."

Sadist Console: On many professional coding websites, if you open the console (F12), it will say: "What are you looking for here? Your personal life? It's not here, get back to work."
Discord had a rare Easter egg: when launching the program, instead of the usual logo, "Wumpus" (their mascot) would appear, but in a very strange, sometimes creepy context. And their update list often contains items like: "Fixed a bug where your cat could summon a demon through chat."

404 Error as Art.

Quite often, on some funeral home websites, the 404 (not found) page contains the text: "It seems you are looking for something that no longer exists. Just like your hopes for eternal life." Harsh, offensive, but very true.
The Useless Web: This one is a whole portal of Easter eggs. Clicking a button takes you to websites that do NOTHING. For example, a site where you endlessly feed a virtual horse cucumbers. It's the darkest irony of how we spend our screen time.

Why do we love them?

Easter eggs are a protest against seriousness. When a bank website suddenly starts playing snake, or when Amazon hides a greeting for aliens in its code, we understand that the world hasn't been completely taken over by robots yet.

Hidden meanings give us a sense of superiority. "I know how to turn Google upside down, but you don't." In an age when neural networks write code in seconds, Easter eggs remain the last bastion of human absurdity. Or do they?




Terminal Web Browsers: A Brief Introduction


Browsing the internet from a command-line interface (CLI) can seem like a throwback to a bygone era - maybe a time when the World Wide Web was in its infancy and something you might access on a UNIX system on a college campus. Certainly, it's not something most people would consider a legitimate daily modern tool. After all, why would you want to browse the web on something that can't display half of what the internet has to offer?

But browsing via a terminal is not only still possible, but in some instances, it's preferable. I still use a CLI browser daily. Moreover, I can't yet imagine a future where it ceases to be part of my workflow.

When it comes to terminal-based browsers, there are actually several to choose from. I use [Lynx](https://lynx.invisible-island.net/), but there's also [w3m](https://github.com/acg/w3m), [eLinks](http://elinks.or.cz/) and a few others. Of course, it comes down to personal preference as to which you choose. However, Lynx always gets my vote, if only because it can access alternative web protocols such as Gopher.

Aside from a few quirks, most terminal browsers have the same capabilities (and limitations). They can render basic HTML and that's about it. Browse online with one of these, and you'll find yourself looking at a web that has no CSS, no images, and no scripts. Moreover, some pages appear almost unreadable and others simply don't work at all. In my experience, even Wikipedia looks a bit messy in a terminal browser.

Sounds restrictive, right? However, it's these very limitations that make these browsers so useful at times.

For one thing, terminal-based web browsers are fast - blazingly fast - precisely because they don't load anything but text. So if you want to get a quick answer to a burning question, forget about booting up Chrome or Firefox. Instead, pop a search query into the terminal of your choice (perhaps with a little bash scripting magic to speed things up even more), and chances are, you'll get what you're looking for not only faster, but using far fewer resources.

Perhaps even more enticing than that: you won't be slowed down or irritated by some of the web's more unsavoury aspects while using a terminal browser. Because terminal browsers don't run scripts, web pages can't bombard you with ads, prompt you to prove you're not a robot, force you to download some obnoxiously massive hi-res image, or - at least in Lynx's case - ask you to accept cookies. And as for paywalls? Well, not to condone or condemn anything, but terminal browsers will often just outright ignore them and load up what you're looking for anyway.

More than anything, though, there's something to be said for visiting a website and only getting exactly what you've requested without any unwanted add-ons.

So, if you're curious, give a terminal browser a go sometime. If nothing else, you'll be left with a different way of looking at the web. And if you're in the business of web design or thinking of creating your own website, maybe consider what your web pages will look like in a terminal browser interface. After all, not only do they still exist, but they're still awesome.





Why does progress breed human stupidity?



At the beginning of the 21st century, when most of us already carried smartphones capable of showing not only cat videos but also live video from the ISS, a Mars rover's feed, and photos of distant galaxies, it suddenly became apparent that the number of people on social media, messaging apps, and forums convinced that our planet is not a sphere at all, but a flat disk, had grown dramatically. It was as if progress, which should have destroyed myths, was instead fueling them. And so we see a strange phenomenon: humanity, capable of launching satellites and building quantum computers, massively reverting to the idea that the Earth is a platter without edges.

I wondered: where did this persistent flat-Earth movement come from in an era of scientific triumph? Why do people with access to Google, smartphones, and wireless networks return to 19th-century arguments like, "I'm standing on the shore and I don't see any curvature, so everything is flat"? Let's take this journey together as journalists, with a touch of sarcasm and no offense intended: mere observers.

Let's start with the backstory. In the mid-19th century, a certain Samuel Birley Rowbotham appeared in Great Britain; in 1849 he published a treatise entitled "Zetetic Astronomy: Earth Not a Globe." He drew naive and sometimes absurd conclusions: if a long ditch or canal appears straight, then there is no curvature, meaning the Earth is flat. He even conducted an experiment on the Bedford Level canal. His ideas developed further, and in 1956, the International Flat Earth Research Society (the "Flat Earth Society") was founded in Great Britain. Consider: this was the age of industrialization, railways, and the telegraph, and yet there were still people saying, "No, no, the Earth as a sphere is a masquerade."

Why? Perhaps part of the answer lies in the fact that the "flat Earth" idea engenders a sense of power: "I know something others don't," "everyone's in on the conspiracy," "I'm awake." And in the internet age, this feeling is heightened. According to the Encyclopedia Britannica, "the idea that the Earth is flat has proven resilient even in modern times because people find it easier to perceive flatness, the horizon appears straight, and social media technologies have fueled the movement."

So here we have a person with a smartphone, but with a 16th-century worldview. What's going on?

In our morning hours of coffee and news, we habitually read about new satellites, missions to Mars, and internet providers delivering signals to the other side of the world in milliseconds. And right next to them are YouTube videos: "Documentary: The Earth is Flat!", "NASA Secret Revealed," "Why Is the Horizon Always Straight If the Earth Were a Globe?" And there are likes, comments, subscribers. People who know how to use a smartphone—and at the same time refuse to look beyond their own screens.

If your parents or their parents ever knitted, crocheted, or cross-stitched, think about what a central processing unit or RAM looks like. It's a web of billions of wires, so thin you can't see them without a microscope. And yet, 150 years ago, people didn't even have electricity in their homes. We've figured out a lot. And yet, people who can barely manage the two buttons on a TV remote are convinced that the horizon is straight, which means the Earth is flat.

When I asked a friend, "Why do you believe the Earth is flat?" he said, "I've seen a video camera take pictures of the horizon—it doesn't curve. And if it doesn't curve, it's flat." I didn't argue. I just kept silent. Because at that moment I realized: it's not about facts, but about feeling. The feeling that "I see," "I know," "I'm not being lied to." And that it's easier to believe "I know something special" than to ask "why does most of the data say otherwise?"

And then comes the internet. Access to any video, forum, or chat. A huge number of people don't just read—they seek confirmation of their existing beliefs. Social media algorithms push similar arguments, creating a bubble. In such an environment, arguments like "400 km of atmosphere," "satellites," or "photos from space" are perceived not as proof, but as part of a conspiracy: "of course they're hiding it" or "it's fake." In the documentary "Behind the Curve" (2018), researchers examine how leading flat-Earth proponents on forums and YouTube wage campaigns to prove it, and how they themselves fall into the trap of their own thinking.

One wonders: who are these "they" who are hiding it? And for what purpose? If millions of engineers, astronauts, and astronomers are participating in a conspiracy... to what end? But such logic rarely lands: when belief is a matter of identity rather than proof, proof hinders rather than helps.

The story moves on. Today, there are even studies showing that it's possible to model ocean circulation under the assumption of a flat Earth and demonstrate that it doesn't match the observed pattern. In other words, science is willing not just to dismiss the "flat Earth" but to show what the world would look like if it were flat, and the result is a different world. However, this doesn't much bother those who have already chosen their side.

The flat Earth is less physics than psychology. When knowledge becomes accessible to everyone, it simultaneously becomes a choice. You can read about satellites. You can read about conspiracies. You can, just as easily, accuse everyone around you of deception. In this sense, the "flat Earth" is the perfect arena: a debate you can't win, but one that lets you be a hero in your circle: "I know the truth!"

Let's ask the question: "Why now, in the digital age?" After all, even in the Middle Ages, without lenses, microscopes, or telescopes, many people knew the Earth was a sphere. And now we have satellites, photography, travel, and unlimited access to information. The answer seems simple: technology doesn't change attitudes. Technology provides the tools. But attitudes are chosen by people. And people want simplicity. A flat plane is simple. A sphere is complex. The curvature of the horizon? You can't notice it from the ground. The altitude of an airplane? You need to understand physics. But "I see a straight horizon"? That's easy.

It's like when a grandmother does embroidery: her stitches are neat, her seams even, her thread thin. In a computer, meanwhile, the wires are microns wide and the transistors number in the hundreds of thousands per square millimeter. People created this web. Can people also imagine the whole world as a flat disk? Of course. But can they launch satellite systems, send spacecraft, and model gravity? Yes. There is no contradiction in the technology, but there is one in perception. And sometimes perception trumps technology.

It's also important to remember: the flat Earth movement isn't a single, monolithic entity. It's more of a network of beliefs, memes, forums, videos, and podcasts. Some people seriously believe it. Some are just trolling, or in it for show. And then there are those who simply farm an audience for engagement. Historically, the movement began with serious attempts to disprove sphericity: Rowbotham, his followers, the Bedford Level experiment. Today, it's conferences, YouTube channels, and social media.

Continuing the conversation, I asked: "Didn't you ever find the spherical arguments compelling?" He shrugged: "I'm not interested. I don't trust them." And here comes the second part: mistrust. Of science, of the authorities, of the "system." In the modern world, where "information" often means "recommended video" and a "source" can be anyone, people have to choose whom to trust by default. Whom to trust? Sometimes it's easier to believe nonsense than to verify. And so the "flat Earth" becomes a symbol of resistance: "I'm not like everyone else; I see."

And everything wouldn't be so alarming if not for one "but." We talk about a planet we all live on, a world connected by the internet, where billions of devices connect us to data. Meanwhile, in this world, we have an idea that directly contradicts the facts. This isn't just a misconception—it's a cultural marker. It says: technology works, but thinking hasn't been updated. We're exploring space, and yet not everyone perceives it as real.

It's as if a chef knew the formula for an omelet soufflé, but at home he made mashed potatoes and convinced everyone that French cuisine was a myth.

And then, what next? Will there be technologies that break the flat Earth meme? Perhaps. But most likely not. Because it's not just about data, but about the choices humanity makes. And choices are rarely wise. Maybe in the future there will be even more colorful theories, and everyone will have a phone, likes, and followers, each convinced they're special. But for now, we live on a planet spinning on its axis, flying around the Sun among billions of stars, while someone still clings to its edge.

You know, I sometimes think: maybe the "flat Earth" movement is a test of our maturity. Not technological maturity (which we've long since demonstrated), but maturity of thought. Because you can have a smartphone, you can have fast internet, you can even have satellites, but still lack the habit of thinking deeper, analyzing, and fact-checking. People have forgotten how to fact-check. In such a vast sea of misinformation, it's hard for a poorly educated person to know where to even start looking for reliable facts. But we're talking about the simplest things, the kind they teach in middle school...

So if you ever see another video like "10 Facts That the Earth is Flat," remember: technology works, results are achieved, but thinking sometimes stands still. And we, reading this, can smile, nod, and... try hard to avoid becoming part of the group that, with a phone, insists that a disk can't be a sphere. Check your facts.



Epilogue


And so, issue #14 has come to an end. The pages have been flipped, and your thoughts have perhaps caught on something here and there, or simply passed it by. But that's the point: not everything has to stay, only the things that truly resonate.

ELPiS isn't about quantity. It's always been about trying to preserve a feeling. To capture a moment. To leave a small trace in a space that too quickly erases everything. And also to entertain in your free time.

If you've read this far, it's been worth it.
That means we're up to the task ;-)
As always, I welcome your thoughts, criticism, and ideas in the guestbook.

Until the next issue.




. . : : z i n e : : . .
 
 
           Created © 2023-2026 for HamsterCMS
           Site runs on rocket fuel