Why Electrical Engineering Seems Backward

Holes, Currents, and Other Quirks of the Field

Paul Krzyzanowski – 2025-09-03

Looking Back

It has been decades since my undergraduate days, but I still remember the peculiarities of my electrical engineering courses. They weren’t difficult because of the math or the physics; they were confusing because the field seemed to cling to conventions that ran against common sense and, at times, against science itself. These quirks might have been harmless in practice, but they left me wondering why electrical engineering stubbornly insisted on being different from the physics and mathematics it relied on.

A World of Backward Conventions

One of the first contradictions I noticed was between my electromagnetism class and my circuits class. In electromagnetism, we learned that electrons flow from the negative terminal of a battery to the positive terminal. But in circuits, I was told that electricity flows in the opposite direction—from positive to negative. This was “conventional current.”

Electrons and Holes

To reconcile the mismatch, circuits textbooks introduced “holes,” imaginary positive charges that move forward as electrons move backward. But this seemed pointless: we were building an entire fictitious model simply to preserve a sign convention. It wasn’t like Einstein’s cosmological constant, which was added to equations to match observations of the universe. Here, nothing in nature required the invention—only tradition.
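To be fair, the sign gymnastics are at least self-consistent. A quick way to see it: with \(n\) carriers per unit volume, each of charge \(q\) and drift velocity \(v\), the current density comes out the same whether you count electrons moving backward or holes moving forward,

\[
J = n(-q)(-v) = n(+q)(+v) = nqv.
\]

The arithmetic works out; the question is whether the extra layer of fiction earns its keep.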

Imaginary Numbers—With a Twist

Another irritation cropped up in AC circuit analysis. Physicists and mathematicians use the symbol \(i\) for the square root of \(-1\). But in electrical engineering, \(i\) was already taken for current. To resolve the conflict, engineers adopted \(j\) for imaginary numbers.

The result was harmless but annoying. Moving between math, physics, and engineering courses, I had to constantly translate between \(i\) and \(j\). The underlying concept remained unchanged, but the notation made the learning process more cumbersome than necessary.
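A concrete instance of the translation, with \(\omega\) for angular frequency, \(L\) for inductance, and \(C\) for capacitance: every circuits text writes the impedances of an inductor and a capacitor as

\[
Z_L = j\omega L, \qquad Z_C = \frac{1}{j\omega C},
\]

which are exactly the formulas a physics text would write with \(i\) in place of \(j\). Same mathematics, different letter.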

Other Engineering Oddities

Electrical engineering had plenty of other conventions that seemed just as arbitrary.

And then there was semiconductor physics, which often felt like it spoke a dialect only loosely related to physics.

Why Electron-Volts Took Over

The electron-volt is actually a physicist’s unit, but it became the default in electrical engineering because of convenience. Typical semiconductor bandgaps, like 1.1 eV for silicon or 1.4 eV for gallium arsenide, line up neatly with the voltages engineers deal with in diodes and transistors. Expressing the bandgap in joules (about \(1.8 \times 10^{-19} \ \text{J}\) for silicon) makes the numbers unwieldy and disconnected from the voltages in practical circuits. Still, the sudden switch in units can feel like a leap to students who expect consistency across physics and engineering.
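The arithmetic behind that joule figure is a one-line conversion, using the elementary charge \(q \approx 1.602 \times 10^{-19}\ \text{C}\), so that \(1\ \text{eV} = 1.602 \times 10^{-19}\ \text{J}\):

\[
E_g = 1.1\ \text{eV} \times 1.602 \times 10^{-19}\ \tfrac{\text{J}}{\text{eV}} \approx 1.8 \times 10^{-19}\ \text{J}.
\]

Nobody wants to carry nineteen orders of magnitude through every bias calculation, so the electron-volt wins.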

Franklin’s Original Mistake

Much of this backwardness can be traced to a single historical accident. In the 18th century, Benjamin Franklin studied static electricity and decided to label one type of charge as positive and the other as negative. He had no knowledge of electrons or atomic structure—his choice was arbitrary. Unfortunately, he guessed wrong. Electrons later turned out to be the moving charges in conductors, flowing from the negative side to the positive, the opposite of Franklin’s convention.

By the time this was discovered, the engineering community had already standardized its equations and diagrams around Franklin’s definitions. Instead of reversing course, which would have been disruptive, the field doubled down on the convention. The result is that to this day, electrical engineering teaches a model of current that runs opposite to the actual movement of charges.

Time for Alignment

Electrical engineering prides itself on rigor, yet it clings to conventions that confuse students and separate the field from the rest of science. Why not admit electrons flow from negative to positive? Why not use \(i\) for imaginary numbers like everyone else? Why not present semiconductor models in a way that maps transparently onto physics, without so many layers of invented terminology?

Dropping outdated conventions wouldn’t just make learning easier. It would also show that electrical engineering values consistency and clarity, as befits a true applied science.

A Lesson from Chemistry—and Physics

Other sciences have faced the same problem. Chemistry once had a confusing mix of units and naming systems—calories and ergs for energy, symbols that varied across countries, inconsistent definitions of acids and bases. Over time, the field standardized: the mole was adopted, the SI system became universal, and IUPAC created rules for chemical names. Today, chemistry students around the world learn from a consistent playbook.

Physics also corrected its course. The “luminiferous aether” was once considered a necessary medium for light waves. It was taught, defended, and embedded into theory—until experiments proved it unnecessary. Rather than cling to tradition, physicists abandoned the aether, and modern physics is stronger for it.

Electrical engineering could do the same. Aligning more closely with physics and mathematics wouldn’t erase the field’s practical flavor but would make it easier for students to learn, for scientists to collaborate, and for the profession to be respected as a true applied science rather than a discipline speaking its own private language.

For the sake of consistency, and for respect, perhaps it’s finally time for the profession to flip the switch.