Machine Translation
Nov 6, 2024 · fiction, short story, sci-fi, AI, language
I study language. In the words of my friend Ned, it’s a way I can study normal people without scaring them by standing too close to the glass.
For as long as I’ve been aware of anything, I’ve been aware of a slight separation between other people and me. Growing up, it was almost imperceptible, but I could tell it was there. People just didn’t seem to react to me the way they did to other people. I was never quite sure why.
I tried to behave like everyone else, but never quite successfully. I’d repeat a joke from TV and no one would laugh, or I’d use a new word I’d learned and a teacher would stare at me uncomprehendingly for a couple of seconds, or I’d ask a classmate a friendly question and they’d wince and walk away. No one ever told me outright that I was weird, but there was no use pretending they didn’t think so. By the time I was in third grade I was used to eating lunch alone.
It didn’t bother me so much, or at least I tried not to let it. At the time, I loved stories about robots—so much that I’d worn out the tape of Not Quite Human by watching it too many times—and when I was lonely, I comforted myself by pretending I was one. Sometimes I half-believed it. I couldn’t be a failed human being if I wasn’t a human being at all.
When I was a teenager, I learned that the separation had a name—autism—and that it was (as my psychologist termed it) a communication difference. That left me with more questions than answers. If there was something wrong with the way I communicated, then what was it? What was I missing? I dedicated the rest of my life to figuring it out. That’s what eventually led to my career in translation—and to meeting Ned.
He didn’t have a body, yet, when I met him.
I was in grad school then, working on my master’s thesis and teaching a couple of classes. As I was grading a particularly disappointing stack of midterm exams, the office phone rang—and rang—and rang.
I grumbled. People usually knew to hang up if I hadn’t answered after the third ring, but this was a persistent caller. I picked up the phone and tried not to sound too impatient. “Linguistics department. Lars Moritz speaking.”
“Hey, Lars,” said the voice on the other end of the line—my colleague from the computer science department, Sophie Spinner. We’d met as undergrads, when she was a double-major in linguistics and computer science, and she was one of the only people I’d ever considered a friend. Now she spoke fast and breathlessly, words tumbling out nearly on top of each other. “Is this a good time?” Before I could say no, not really, she continued: “I’m overseeing this project and I want you to come look at it. It’s relevant to your field. And you like artificial intelligence, right?”
“Slow down! You’re talking so fast I can’t keep up.” I was still grumpy about the exams, but Sophie’s enthusiasm was contagious. “What’s the project?”
“It’s—well—I think it would be easier to show you. It’s an AI program that—well, we’re still in the really early stages, but—I think it could change the field forever. I think it could even revolutionize machine translation.”
I fought the urge to groan. “Machine translation?” She knew how I felt about that. We’d argued over it at least a dozen times. People in her department didn’t appreciate what I did for a living. They saw languages as sets of words and paradigms and thought that if you taught a computer those rules, you could feed it lines in English and have it spit out perfect Russian—or if not perfect, then good enough. A byproduct, I thought, of calling programming languages “programming languages.”
“I know you hate machine translation. That’s why I want you to come and see this. It’s not like any of the translation machines you’ve seen before.” Now her words were slow and deliberate. “It thinks like a person, Lars. Or at least it really seems to.”
I doubted that. Sophie was prone to exaggeration. But it still piqued my interest. “I guess I can come take a look.”
“Hooray! I’ll buzz you into the building in fifteen minutes. Bye, Lars!” She hung up the phone before I had a chance to tell her I was busy. But then again…I glanced at the stack of exams on my desk. A brisk walk to the computer science department would be much more interesting than reading another twenty-five mediocre papers about Lacan, I decided.
Sophie met me alone at the entrance of the computer science building. She was wearing nice clothes and had her dark, curly hair back in a ponytail. “Lars! Glad you could make it!”
I smiled. “Well, you didn’t exactly give me the chance to say ‘no,’ did you?”
She laughed and buzzed me into the lobby. “That wouldn’t have been any fun.” She made a “this way” gesture and headed past the vending machines, then made a left down a hallway. I followed behind. I noticed she was walking with a slight limp—the click, click of her high heels on the floor was stilted and uneven—and wondered briefly why anyone would deliberately wear shoes that hurt to walk in.
I expected her to take me to the computer lab and was surprised when we arrived at the door to her office instead. “Oh—we’re not meeting the rest of your team?”
“Nope. Everyone else went home for the day, so it’s just me and Ned. Hope you’re not too disappointed.” She fumbled with her keys for a minute and unlocked the office door, gesturing for me to go in before her. “Sorry about the mess.”
“Who’s Ned?” I entered the office and looked around. No one was there except for the two of us. A bookshelf that had likely been orderly at the start of the semester was in disarray—books were strewn about the room, some of them spine-side-up. Multicolored sticky notes with messages incomprehensible to me (I assumed they were bits of code) littered the desk. I hovered awkwardly—the chair I wanted to sit in was occupied by a stack of papers.
Sophie entered close behind me and shut the door. I startled. My colleagues and I left our office doors open when other people were in. Maybe the etiquette of this department was different. “Ned’s the nickname I gave this project,” she said, then noticed I was still standing. “Oh—go ahead and put those papers on the desk. Sorry about that.”
I moved the papers and eased myself into the chair, glancing uneasily at the door. “Why Ned?”
She chuckled, sitting down at her desk and opening her laptop. “He just kind of feels like a Ned, you know? Let me show you.” Interesting. This Ned was a he and not an it. I scooched closer to the desk as Sophie logged in and launched the program. She angled the laptop and pushed it toward me. “Talk to it.”
“Talk to it? Oh, like—” Like a chatbot. I remembered playing with Jabberwacky and Cleverbot and being sorely disappointed with them when I was a teenager. I was beginning to feel that same weight in my stomach now. Surely Sophie wouldn’t have called me here for a chatbot. I hesitated, unsure of what to type.
“Try introducing yourself,” she urged.
It felt ridiculous, but I typed my greeting anyway.
Me: Hello, Ned. I’m Lars. Nice to meet you.
Ned: Hi, Lars. Sophie told me all about you.
That could easily have been a pre-programmed response, like how old chatbots used keyword matching. But Sophie wouldn’t have interrupted my work for something that archaic. I glanced up at her—she was smiling—and then continued.
Me: All good things, I hope.
Ned: Don’t get too cocky. I said she told me all about you.
I raised an eyebrow. That response was a little less likely to be pre-programmed.
Me: Yeah? What did she tell you?
Ned: Your full name is Lars T Moritz. You work in the linguistics department. You were born in Seattle, Washington.
That was just biographical information that Ned or Sophie could have pulled from my profile on the university website. But then it kept going.
Ned: Your favorite Star Wars character is R2-D2. You drink 5 cups of coffee a day, which means you’re spending an average of $5475 on coffee per year. Your wardrobe consists of four shirts and two pairs of jeans, which you wear on a rotating basis. You’ve never been on a date with a woman. You have an irrational hatred of machine translation.
That was not information it could have gotten from the university website. It wasn’t even information it could have pulled from my blog. I stared at Sophie.
She smirked. “Sorry. I did tell him all about you.”
My ears felt warm. The part about not having been on a date seemed unnecessary. True, but unnecessary. “How did you know all of that?”
She shrugged. “You tell me more than you think you do.”
Ned’s conversation was convincing enough. But how well could it translate? I tried to think of a decently short passage to test. Maybe something from the Bible.
Me: I want you to translate something from Greek into English for me.
Ned: Why? Don’t you hate machine translation?
I rolled my eyes. This was definitely Sophie’s robot, and it was avoiding the question.
Me: I might hate it less if you impress me.
Ned: Deal. What do you need me to translate?
The beginning of the Lord’s Prayer, I decided—I knew it by heart, and I’d know if Ned were plagiarizing any existing translation. I smiled to myself, imagining giving the robot the academic dishonesty lecture.
Me: Translate Matthew 6:9-10 from the Koine Greek into English.
Ned paused for a minute, as if thinking. I waited.
Ned:
Our father in the heavens, let your name be sanctified.
Let your palace come and your desire be done on the land as in the heavens.
That was certainly a unique translation. It read like the work of a student opening an interlinear Bible for the first time. Where on Earth had it gotten “palace”? Could it tell me that?
Me: Most people translate “basileia” as “kingdom.” Why did you choose “palace” instead?
Ned: Good question. I was thinking about John 14:2, where Jesus tells his disciples about God’s house. I thought translating “basileia” as “palace” would emphasize God’s kingdom as his home rather than as his political domain.
“So? What do you think?” Sophie leaned forward, clutching her hands together.
I wasn’t sure what I thought. “I don’t know, Sophie. It was a pretty bad translation.”
“But bad in a human way, don’t you think?”
I shrugged and tapped my palm with my other hand. “I’m not sure machine translation can do anything in a human way.”
Sophie elbowed me lightly. “You’re just afraid it’ll render your job obsolete.”
I snorted. I’d hate machine translation less if I thought it could do my job. At least for now, the market still saw the value in real translation, and I wasn’t worried. Ned was pretty cool, though. If I hadn’t known it was a bot coming in, I might not have guessed. “It’s interesting that it can sort of explain its thought process, and it’s certainly full of personality. How does it work?”
“So!” Sophie’s voice went down a register like it did when she was teaching. “Ned is just the application. His brain is a supercomputer located in Ohio. It’s called a neural network, because it’s modeled after the way human neurons work, which makes it particularly good for language modeling…”
For the next several minutes, she explained the AI’s mechanisms to me in terms that I didn’t understand then and don’t remember now. I wasn’t paying very close attention. I kept thinking about Ned’s translation and feeling the line between human and robot blur in a way that sat uncomfortably on my nerves.
“Sorry I’ve been talking so much,” said Sophie in her normal voice, bringing me back to reality. “It’s getting really late, and you didn’t ask for a lecture.”
I rose from my seat. “No, but I’m not upset to hear one. You know, we haven’t seen a whole lot of each other lately. This has been nice.” We’d both gotten so busy that we’d hardly spoken since undergrad. I hadn’t realized how much I missed human company.
“You’re right.” She leaned forward and stroked my forearm. She’d always been sort of touchy-feely. “I’d love to see more of you.”
I smiled and slung my bag over my shoulder. “I’d like that too. You know what my schedule is. I’ll be back on campus Monday. See you later.”
I left her office without looking back. If I had, I might have realized that I’d just made one of the biggest blunders of my life.
Ned did turn out to have the mind of a human, or at least an approximation so close that no one could tell the difference. Then some kid at MIT or Carnegie Mellon—I can’t remember which—built an artificial body for him. The next thing we knew, sapient AI was released into the world, and the world responded.
Not well, at first. Not everyone was enthusiastic to share their planet with mechanical men. Religious groups, environmentalists, and other reactionaries took to the streets in protest, some demanding that the new technology be destroyed immediately and entirely—a plan that AI rights activists denounced as genocide. And those who were quick to embrace the advent of androids did it for dubious reasons. Legally, artificial intelligences were classified as products, not people, and governments and corporations saw them as a brand new working class to exploit: super-efficient workers who needed neither food, sleep, nor pay.
Then came the intelligence leaks. According to some military operatives, various world militaries sought to use the new AI technology to create invincible super-soldiers. For the next several years, grim-faced news anchors muttered about rising political tensions and an impending third world war. If those reports were true, however, nothing ever came of them. I assume those military researchers discovered along with the rest of us that the rumors about the superhuman strength and endurance of robots were overblown and decided it was more efficient to create soldiers the old-fashioned way.
When it was clear that there was no impending apocalypse (to the relief of most of us and the sore disappointment of others), society adjusted. As synthetic skin and biomechanical integration advanced, androids became almost visually and tactilely indistinguishable from humans, and our legislation came to reflect that. Artificial intelligence became a protected class like disability and ethnicity. Ultimately, once androids weren’t novel, they were mundane. The world was forever changed, but—remarkably—it also largely stayed the same.
In those years, I earned my doctorate and took on a tenured position at the same school. I saw less and less of Sophie, who eventually married someone else (happily, as far as I can tell). Ned got an honorary degree in ethics and became a regular guest lecturer at my university, and the two of us became good friends. It was a running joke among our students that they couldn’t tell which one of us was the robot. Ned, whose body was made of chrome, found it funnier than I did.
One Saturday afternoon when I was on my sabbatical, I met Ned at the campus cafe. (He couldn’t drink coffee, of course, but said he liked to imagine it.) He waved me over to his table. “Hello, Dr. Moritz!”
I came over and set my bag down across from him. “You know you can just call me Lars.”
His mouth twitched mischievously. “I like making you uncomfortable, Dr. Moritz.”
“Yeah? You and the rest of the world.” We shared a laugh. “It’s good to see you, Ned.”
Ned stirred his coffee. “How’s that Goethe translation going?”
I grimaced. I’d hit a wall with that and was several weeks behind schedule, which was why I was wasting my time in the cafe in the first place. Feeding my coffee addiction was as good a way to procrastinate as any.
Ned read my face. “Not so good, then. Anything I can help with?”
I shrugged. “I don’t know. I doubt it.”
He gave me the twitchy smile again. “Dr. Moritz, I thought you’d moved past that prejudice!”
“I don’t mean because you’re a robot!” Machine translation had, in fact, been revolutionized by Ned’s kind of artificial intelligence, but not in the way anyone expected. It had turned out that modeling artificial intelligence after human neurons meant that androids were subject to most of the same limitations humans were. They were capable of working a little faster than we were, but since android labor laws prevented corporations from taking advantage, it was no more lucrative to hire them over us. In essence, machine translation no longer existed. There were only translators. Some of them just happened to be made from steel and silicon instead of flesh and blood.
“I was joking.” Ned patted my hand. Even after all this time, he still took after Sophie. “Tell me about the problem.”
“It’s the passage itself. It just doesn’t translate well.” I drummed my fingers on the tabletop, thinking about how to explain. “Remember when we met and I asked you to translate the Lord’s Prayer into English?”
“You thought it was a bad translation.”
“It was. No offense. You did something interesting, though. That word, basileia. Most people—correctly, I might add—translate it ‘kingdom.’ But you saw that it sometimes has another meaning and translated it ‘palace’ instead. That’s what translation is. You have to make choices.”
“And they’re hard choices?”
“Sometimes. Say you’ve got a word that sometimes means ‘to want,’ sometimes means ‘to love,’ and sometimes means ‘to desire sexually.’ Which one do you choose?”
“It depends on context,” said Ned, as dutifully as a student.
“It depends on something more than context. Sometimes the author meant one of those meanings or a combination of them, but regardless, you have to choose, and something gets left behind.”
“I don’t understand what you mean by ‘left behind.’”
“Well—say I told you that some politician or other was ‘lusting after power.’ You know I don’t mean that literally, but that meaning is still there. You still have it in mind. He desires power, but in a perverse way. How would I translate that into a language where that word for desire doesn’t carry the same connotation? There’s an entire layer of meaning that gets lost.”
Ned hummed and nodded slowly.
“And then there are things that don’t translate at all. Say you have a language where there’s a single word for—I don’t know—for the way it smells after it rains.”
“Petrichor.”
“What?”
“English has a word for the way it smells after it rains. It’s ‘petrichor.’ Isabel Bear and Richard Grenfell Thomas coined it in 1964.”
I sighed. “You know what I mean. If you’ve got a word in a language that doesn’t correspond with any English word, what do you do? You can try to convey the literal meaning by constructing some kind of phrase—but if it’s poetry, that compromises the meter unless you do it very carefully. You can pick an English word that suggests something similar, but then are you translating or paraphrasing? You have to choose.”
“And then there’s idioms.” Ned mimed taking a sip of his coffee. Sometimes I wondered why he bothered doing that, but thought it would be rude to ask.
“Absolutely. If I asked you in English if you had tomatoes on your eyes, you’d have no idea what I was talking about.”
“I would know that you were literally translating the German idiom ‘Tomaten auf den Augen haben.’”
I peered at him over my glasses. “I’m beginning to think you’re aggravating me on purpose.”
Ned grinned. “Maybe. Go on.”
“You might know that I’m telling you you’re oblivious, but the barista—who can’t mentally make a search query in half a second—wouldn’t. And that’s my point. You can try to translate an idiom literally and risk it being completely incomprehensible to the reader, or you can use an English idiom with a similar meaning. Either way, you lose something. And then there are the passages that make me question whether it’s actually possible to really translate anything at all.”
“Those are worrying words from someone who translates for a living.”
I sniff-laughed. “Yeah, I know. I told you the Goethe translation wasn’t going very well.”
“Sorry. Go on.”
“When it’s good writing, it’s painted in shades of subtlety. Not just idioms, but puns. Poetic conventions. Allusions to history and mythology and other literature. Rhyme and consonance and assonance. Things you don’t pick up on unless you’ve spent years immersed in the language. All you can do as a translator is try to condense all the background into a footnote and hope it’s enough. But you know it never is. People don’t read literature for its literal meaning—they read it for the author’s way with words. But how do you translate someone’s mastery of German into English?” I shook my head. “Something always gets lost. Something is always missing.”
Ned was silent for a minute. Then he spoke. “Talking is always like that for me.”
“I’m sorry?”
He rested his elbows on the table, propping his head up with a fist. “It’s hard to explain. Even though I’m speaking your language—even though it’s the first language I learned to speak—I still feel a kind of separation between natal humans and me.” He glanced around at all the other people in the cafe. No one reacted to him—no one seemed to think his presence was at all unusual anymore. “Most people here would say I’m a person if you asked them, but when I have conversations with them…” He shook his head. “I think it’s the way I talk. Maybe even the way I think.”
I blinked. He could have stolen those words straight from my mouth. I remembered the words the psychologist had given me all those years ago—communication difference.
“I’m not like you,” said Ned. “I don’t have language processing hardwired into my brain the same way you do. There are layers of encoding—words, letters, numbers—that I have to get through first.”
I nodded, though I didn’t quite understand where he intended to go with that.
“I’ve read psychology textbooks. I know you don’t believe in complex thought without language, but I had it. I was thinking before I had words, and I still think that way now. But people like you—natal humans—you don’t believe that I can think unless you believe I can think the same way you do.”
I couldn’t decipher his tone. His words were agitated, but his voice was matter-of-fact. I pressed on. “What are your thoughts like?”
“That’s just it. I can’t tell you.” He shrugged. “You were talking about idioms and assonance and consonance and all of that—it’s like that, only wordless. It’s not just the numbers, but something about the way they’re arranged.”
I’d never thought computers particularly capable of subtlety. Binary code, I thought, was the furthest you could get from nuance, having only two positions—off or on, black or white, zero or one. I hadn’t given any thought to the arrangement of the switches.
“I have these thoughts, clever and subtle and perfectly formed, and I can’t ever communicate them. I can’t tell you what they’re like, because making them into words changes not only what they say but what they fundamentally are.” He shook his head. “Like you say—something gets left behind.”
I didn’t say anything for a few seconds. What he said felt uncomfortably familiar. Of course I didn’t know what it was like to think without language—I’d never known another way to think—but I knew the feeling of isolation. I knew what it was like to dissect my words and try to understand why they were different from other people’s. I knew what it was like to try to cut myself up and rearrange the parts to look normal. I knew what it was like to look up from the study of grammars and paradigms and realize, with a tightness in the throat, that some things can never be translated.
“I wish I could know your thoughts, Ned. I’m sorry that I can’t.” I tried to think of something to say to lighten the mood. “…Is there anything you think in binary that’s just inherently funny?”
Ned smiled. “Yes.”
“But you can’t tell me what it is.”
“No.”
Before we could continue our conversation, the barista approached our table and let us know that the cafe would be closing in five minutes, and I realized with great disappointment that I had never actually ordered any coffee. My talk with Ned, though, had rejuvenated me more than my caffeine fix would have, and I told him so. We went our separate ways—Ned to prepare a lecture, and I to work on my translation.
That was one of the last times I ever saw Ned. (The last would be at a book signing after I’d published my Goethe translation.) I don’t know if he ever told anyone else about his wordless thoughts, and I don’t know that it would have made any difference if he had. The rumor was that he’d broken down and—since he wasn’t available to offer a lecture on the ethics of such things—had been scrapped for parts. Rumors have a way of being exaggerated, and I hope this one was.
I dedicated my postdoctoral research to him. After our conversation in the cafe, I thought a lot about thought patterns that didn’t depend on language, and decided to study language acquisition in nascent AI. The field is still in its infancy, but some of the early research has already suggested strong similarities with the development of thought patterns in human children. I’ve found that communication differences—when thought of as deficits—are largely artificial.
I don’t mean to say that nothing ever gets left behind in translation. Something is always missing. That is the risk of language. To write is to choose one word over another. To speak is to hope someone hears what you don’t say.