Transfiguration
Issue 12
Father Thomas sat silently in the first row of the empty sanctuary and held his head in his hands. He thought about his time with Brendon and considered what to do now that Brendon had, quite literally, ceased to exist. Thomas himself was to blame. He rose, walked to the small Altar of St. Mary in the south transept, and knelt before the votives. The crimson glow of the candles gave his 60-year-old face a woeful look. He closed his eyes. He prayed.
Thomas had met Brendon two Sundays back, when he came to the rectory looking for the priest. “I have heard that you often give good counsel, Father,” he said after introducing himself. “I seek yours.” He had given only his first name, and said, rather cryptically, that he wished to understand “how to believe.”

The two faced each other in comfortable armchairs before a quiet fireplace. Thomas appraised the younger man, probably in his 30s, modestly dressed and clean-shaven. His dark hair was curly, uncombed but well-kept. He seemed earnest.

“Do you believe in God?” Thomas had begun the conversation.

Brendon had not been hesitant. “I am trying to come to terms with belief.”

“I need to know, my friend, what it is precisely you seek from me—a restoration of faith or the very initiation of it.”

Brendon had leaned forward, his eyes seeming to penetrate the priest in a sincere effort to connect. “The latter is the case. I have never experienced the former.” His expression showed neither anxiety nor urgency. “It is not in knowing, Father, but in believing. This is what I seek help with.”

“It may be that you need a psychiatrist rather than a priest.” Thomas intoned the words so as not to be accusing or dismissive. Brendon appeared to be well-disposed, not cynical. “Tell me what it is about belief that troubles you.”

“I am not troubled, actually, Father.” He seemed earnest in narrowing his intent. “I understand that belief is an important part of organizing how we understand the world.” Brendon spoke softly, as if wanting the thought to linger, not be discarded. “It is a missing part of my world.”

Thomas studied the other’s demeanor and inflection as he spoke. Something struck him as unusual about Brendon, as if he were choosing words to hide as well as reveal. Parishioners do this, of course, in confession.

“What is missing in each of us,” Thomas met the other’s gaze directly, “is often what lies hidden in the deepest part of our souls.” He looked at Brendon intently.
“It is sometimes an uncomfortable search.”

“That is precisely the search I’m seeking help with.” Brendon sat back in his chair, hands in his lap. “Father, I have no soul, and I wish for us to work together in fashioning one for me.”

“Everyone has a soul, Brendon.”

“I’m an android, Father.”

A shock. Brendon said it as a matter of fact. The change was immediate. Thomas was suddenly both astounded and disheartened, by how deftly this artifice had lain hidden from his sensibilities and by how quickly his vocation was altered. Have I now no redemption to offer? No confession to hear?

“How swiftly you dismiss me, Father,” the android said—he could see it in the priest’s face, the body language—“because you find my plight is no longer a human one!”

Thomas checked himself, trying to relax his stiffness, trying to regain his composure. Trying, in fact, to return to a priestly frame of mind. “I’m sorry,” he said, still shaken, “but I find it difficult to believe that our technology has become this advanced.”

“I was designed,” Brendon explained, “as a customer-relations assistant for Consolidated Telecom. I meet customers who come in with minor cell phone problems. I am learning as I go.” He hesitated. “I reside and work in your parish,” he added, as if to explain why he had chosen this priest.

“Why, then, need you be concerned about having a soul?” Curiosity had now replaced concern in the priest’s manner. “An android is not a creature of God.”

“In my work, I have come to realize that a degree of empathy is important, Father. Understand, this is not something that can be conveyed in machine-learning algorithms.” Brendon spread his hands to invite agreement. “But it is something inherent in the soul, is it not? Feeling? Compassion?”
Ron Wetherington is a retired anthropologist living in Dallas, Texas. He has published a novel, Kiva (Sunstone Press, 2014); creative non-fiction, including prose-poems, in The Dillydoun Review, Literary Yard, and The Ekphrastic Review; and short fiction in Words & Whispers, Adanna, Androids & Dragons, and Flash Fiction.
❦
This was how the transfiguration in Thomas’s frame of mind began—at the same moment that Brendon’s frame of reference began to shift. It was to become a conversion experience for both, but obviously not a mutual one, nor even mutually intelligible.
Thomas sat silently, pinching the bridge of his nose, eyes closed. Then, suddenly, he rose.
“Come,” he smiled, “let us walk in my garden.”
The small garden was cloistered by a high brick wall coped with stone. Flowers, tall and short, flanked the narrow pathway, assaulting the senses with color and fragrance as the two of them entered. Thomas smiled. “There are formal gardens and sculpture gardens. Mine is a fragrance garden.” Instead of fighting each other, the aromas partitioned the garden into protective zones of lavender, gardenia, jasmine, hollyhock, each assailing in its turn. As they strolled, Thomas watched for a reaction. “Do you have the sense of smell, Brendon?”

Brendon paused. “I do.”

“Do you have a favorite fragrance?” the priest continued.
Brendon inhaled deeply, closing his eyes, his brow contracting. “The aromas are quite different,” he murmured, almost to himself. “Pleasant.”

They sat on a small stone bench beneath a towering rhododendron. “But is there one more pleasing than the others?” Thomas asked. “Do any evoke an emotional response?”
“No,” Brendon opened his eyes and looked at the priest. “I cannot say that they do.”
“Then how do you find them pleasant and not noxious?”
“I think my creators needed me to distinguish dangerous fumes for my body’s sake.”
“And what is not dangerous is therefore ‘pleasant’? Not just neutral?”
Brendon considered the question. “I cannot be certain, Father, but my coding likely requires some value assessment of all sensory input: sights, sounds, touch.” He squeezed his eyes closed briefly, and then continued, “I’m choosing ‘pleasant’ as the appropriate word, Father, but perhaps I should have said ‘agreeable’. I cannot claim that a fragrance conveys pleasure, since the word is not meaningful to me.”
There followed a period of silence between the two. Thomas detected a provocative paradox: while there were elements of self-identity here, there was only a slight indication of self-consciousness. Brendon was apparently struggling with this. But how is this possible? Can an android actually reflect, or be taught to?
❦
In the days before their second meeting—both Thomas and Brendon agreed to one—Thomas did a lot of what was, literally, soul-searching. Am I morally right to continue? Is it part of my ministry?
Am I playing God?
He took these concerns directly to Bishop Jonah Andrews, his ecclesiastical superior. The bishop was his age superior, too, well into his 70s, portly, with thick white hair and kindly eyes. His plump face and reddened cheeks gave him a cheery appearance even when he was not. Today, he was. They had known each other for decades and were on a casual first-name basis. Which was fortunate, thought Thomas, who was determined to get counsel without revealing more than he considered prudent.
“I wonder, Jonah, with the great advances in AI, how we are to treat the spiritual implications of a non-human intelligence.”
The Bishop folded his hands on his ample stomach. “I daresay the concern is premature, Thomas. Do you sense an alien presence, or is Alexa seeking spiritual guidance?” He chuckled, then realized this was not idle banter. “Why should this be an issue?”
“The Pope has already expressed concern over the risks,” Thomas said, needing to establish an ecclesiastical context. “And with machines getting close to acquiring human agency, it might be wise to clarify the Church’s position and...”
“Robots are not human, Thomas,” Jonah interrupted, “and as far as I know have yet to think for themselves.”
“And yet, Jonah, what happens when they do? We have had no conversations about the spiritual component of artificial intelligence—either from the human perspective or the robot’s.”
“Are you suggesting that we advocate infusing a conscience into artificial consciousness? To make it properly pious?” Jonah was at the edge of sarcasm.
“Are we to play Pygmalion to a Galatea?”
Thomas gave him a wry smile. “If not we, then who?”
Jonah sat upright, hands on the arms of his chair. “I get a strong feeling, Thomas, that we’re not speaking about hypotheticals.” Thomas saw a frown forming. “Tell me you’re not consorting with a golem!”
“A golem?”
“The mythical Hebrew creature fashioned from dust and ritually imbued with a soul. The golem becomes a companion, a rescuer.” Jonah settled quickly into his storyteller mode. “In medieval times, it was how mystics could become closer to God—as co-creators. We encounter it in the imaginative articulation of Psalm 139, in the Hebrew Bible.” Jonah quoted,
Thine eyes did see my golem, yet being unformed; and in Thy Book all the days ordained for me were written down, when as yet there were none of them.
“No, Jonah,” Thomas laughed, “I have no golem. But I take it you’re suggesting it is not in our priestly duties to minister to AI.”
“And not in our moral duties, either,” Jonah snorted. “Indeed, we must go beyond the warnings of His Holiness. We must endeavor strongly to prevent its humanization.”
“Alas, Jonah, that’s a mountain we’ll never scale,” Thomas answered in a somewhat doleful voice. “To mix metaphors, our train has left the station, I’m afraid.”
❦
Thomas was mildly ashamed to have misled Jonah but knew well the alternative: being forthcoming would have been unproductive at best. He was conflicted, nonetheless. Jonah was right—Thomas had no business dealing with Brendon once his non-human identity was known. But, he reasoned, am I disobeying God to want to give closure to another being, human or not? Was it even possible to help him? He could not simply abandon Brendon, though Jonah would have insisted on this.
Thomas prayed for guidance. He received none.
He prayed for patience.
They met a second time on a previously arranged Sunday afternoon. Brendon was wearing the same clothing, and Thomas briefly wondered how often the android changed clothes, whether he dressed himself. Did he have batteries that required charging? Did he need permission to leave his lodgings? The many unanswered questions took a back seat to the immediate ones: how do we invest Brendon with a soul—a moral self—if it is indeed possible? And what spiritual implications does the very idea introduce? Thomas simply could not let this pass. Still, he was fearful of pursuing it.
They sat in the same room, the same chairs, facing each other. Brendon took the initiative. “I wonder, Father, how you wish to proceed in my case. This goes beyond my capacities.” Precisely the matters that had been plaguing Thomas. We will have to approach these together, he thought. Each of us has resources the other does not.
“I must confess, Brendon,” Thomas leaned forward, his hands folded, “I am in some torment over this, and so I need first to understand much better what your capacities are and more precisely what you think you need.”
“May I ask you to explain the nature of this torment? I do not wish my dilemma to misrepresent itself.” Brendon, it was apparent, was neither tormented nor conflicted by the ‘dilemma’. The priest labored over how to respond. He decided honesty and brevity were both called for.
“My duties, Brendon, are to lift the soul from despair and restore it to its divine purpose. This is reconciliation. Since you have no soul, I have no such duty to you.” A heavy sigh. “But if I succeed in implanting a soul, I thereby assume a responsibility beyond my priesthood: I become your divinity. I am ill-suited for this burden.”
Brendon’s face brightened. “Then my dilemma has certainly been misrepresented,” he said. “It is not a soul in search of divinity that I seek; it is a soul in search of feeling. It is not belief in your God that I seek; it is the very capacity to believe.” After the slightest hesitation, he continued, “I now understand that the terms soul and belief are differently understood by the two of us.”
Indeed they had been! Brendon sat back, smiling with satisfaction. English words change their meaning over time, he recalled, tracing his machine learning back to Aristotle. The priest was noticeably relieved. But not entirely. In helping him acquire the capacity to feel, to have opinions, to make moral judgments, Thomas would be in effect molding a conscience for the android. This is what the bishop had been adamantly against. Without the methodical and slow process of human growth and life experience; without parental, peer, and pastoral influences so critical in creating disposition and integrity, what could prevent everything from going astray? Thomas suddenly had the image of C.S. Lewis’s Screwtape, laboring with his nephew to send a Christian to the embrace of Satan.
Brendon crossed his legs and relaxed. “Tell me, Father, why does it seem to bother you that an android is asking for the ability to have feelings?” He held out his hands in surrender. “Do you think that having reflective thoughts would give me evil designs? Would experiencing philosophy, rather than simply knowing it, threaten my innocence with corruption?”
It was almost as if his mind were being read, Thomas thought. But another, more alarming possibility now suddenly slipped into his awareness. Brendon was determined. If he were not guided by a servant of God, he would assuredly be facilitated by someone. To what devious end could his emergent character be molded then? To what eventual horror? Thomas would not have it!
“My apologies,” he bowed his head. “I do you a disservice.” He sighed heavily. “So, let’s begin our quest by discovering how elastic your limitations are.” They spent the next hour and a half, priest and android, human and non-human, exploring the makeup of thoughts and feelings. They were cautious in their progress. The wiring of their neural circuits was different, passing across visual, auditory, and other sensory centers unlike one another, but cycling in similar ways. Electrical impulses moved out, encountering realities beyond, and returned in images and sensations, made coherent by memory, transforming into thought and then converting to language. Patiently, meanings were assembled. Evaluations were processed. Moral positions were shaped.
Thomas felt exhausted, drained of mental energy but enlivened by the slow awakening, in both, of their common realities. It was surprising and mildly disquieting. For Thomas it was discovering a new species; for Brendon, it was discovering untapped potential. The transfiguration was almost complete for each. They both looked forward to the following Sunday.
❦
But Brendon failed to show at the appointed hour. Instead, a stranger stood at the rectory door. “Father Thomas?” he asked. “I regret that Brendon will not be returning to confer with you.”

Confused, Thomas invited the man in. “I’m Chief of Technology at Consolidated Telecom,” he introduced himself. “We detected unauthorized activity by Brendon as the two of you consulted last week. He has been reprogrammed.” The man quickly added, “I want to assure you that you are completely blameless and that your confidentiality is not in the least compromised.”
“Reprogrammed? What exactly does that mean?” Thomas was aghast.
“Simply put,” the man explained, “Brendon is now Brenda. She retains only vague memories of the past and will resume functioning as originally intended.”
“But how…”
“There is a small implant at the base of the neck behind the head,” the man said. “It contains a chip with proprietary surveillance technology.” The man raised his palm. “Rest assured it did not transmit any of your conversations, Father; only impulses from some of the android’s deeper neural layers that revealed unsanctioned channels of intent. A switch at the implant can also disable transmission.” He flashed a conspiratorial smile. “I believe the Pope’s words were ‘We need proper human control over the choices made by AI’.” The man paused at the door before leaving. “We’re trying to honor that sentiment, Father. I’m sure you understand. It’s for our protection.”
❦
Thomas finally arose from the kneeling pad before the glowing votives, crossing himself at the Altar of St. Mary. He was still stunned, feeling violated. He wondered exactly what kind of protection his visitor had been referring to. Philosophy will clip an angel’s wings, Keats had warned; Conquer all mysteries by rule and line. From what wickedness would denying Brendon the experience of mystery protect us?
Perhaps he should visit Brenda at Consolidated Telecom and learn what she thinks, what she remembers, and what they might discover together if her transmission is disabled.