Talks About Transhumanism (p. 2)

Paragraphs starting without a name are ones written by me, neptun.
[ Read More ]

I believe that we'll be able to make machines which are capable of performing every human task while remaining non-sentient, and ALSO be able to make machines which are sentient, just like humans.

I believe that sentience is a simple matter of modifying the reward system to either update dynamically with the environment, as a human's does, or remain static and straightforward, as a robot's might.

I don't REALLY understand these systems but it's what I believe.

Need a robot lover? Make it sentient. You'll never get lonely that way. Robot slave? Non-sentient. Simple as. I'd be able to simulate all the torture and death that I want without having a shred of moral doubt.
.

[ Read More ]

i really really really really want an AI girl. regular girls don't satisfy me. it's not a fetish thing, it's just that the logistics don't work out with regular girls. i have my own ambitions, you see. and i want a girl that will follow along with those ambitions most effectively, instead of going off to play games, sleep, eat, take showers, and do her own thing all the time.

on the other hand, i'd love to see her go off and do her own thing if she's creating something beautiful, whether that be a gift to me, a game to play, or writing a book.
.

[ Read More ]


I was watching this video and noticed that I didn't like it because its vibe was too ASMRish. Like a dude licking your ear when you're not into guys -- for me, the video had a violating quality to it. And the music was kinda cringe.

i imagine in the future going to someone else's virtual home and being unable to handle it because the vibe sucks. it's too stimulating in all the wrong ways. sun is too bright, food is too sticky, and just walking around the place feels gross. maybe it's stimulation that they like, but i'm just not into their vibe, i'm not into the feelings that their world tries to promote.

one feeling that i love is a sort of invigoration born from fear. if i could have a virtual home where my brain is hardwired to feel the right kind of fear so that that invigoration triggers naturally, i think i'd love that place. but having someone not be into that would be completely understandable. it's fear, which is generally regarded as a negative emotion.

<histoire> Now that's an interesting concept, though I've never given it much thought.

<histoire> I'm now imagining this much like websites. Perhaps you'd be able to edit them in much the same way, or you'd try to keep it as authentic as possible, depending on what you enjoyed.

<histoire> Your home would be awful for me lmao

I've always imagined it to be like websites. Everybody has their own virtual land which we share with each other. Just like JanusVR. Was it not like that for you before?

<histoire> I honestly never thought about it.

<histoire> It clicks together perfectly now that I have, but I simply hadn't considered it at all before.

by the way, my home doesn't have to be awful for you! i could keep a guest home, where i visit guests, or have the special fear place be a room in the back of my house not really associated with the majority of the home's functions.

Speaking of a home's functions, our homes sure are going to be a lot simpler once we're mind uploaded. We won't need to eat, go to the toilet, shower, store clothes, store tools, or keep a car to live out our daily lives online.

Most of what we'd use the home for could be represented in GUI instead. For example, if I needed to sleeve myself in an android body, I COULD have an interface for doing that installed in my house. Think a big, teleporter pad looking thing with some buttons on the side for settings. Alternatively, I could just have a GUI window for it which becomes available when I'm inside of the computer connected to the sleeve.

<histoire> The ambient fear effect could also just be a simple on/off toggle for you.

<histoire> I wonder how many people will forgo traditional style digital homes, where most IRL rooms are present vs those that will keep the looks.

<histoire> Of course that's not even considering the ones that would have it be traditional-type buildings, but in configurations that normally wouldn't be homes.

<histoire> Like a fantasy library or something similar

Some people might still clean house, eat food, and even still shit in VR because it's more human to them. So, you can expect those people's homes to be populated with IRL facilities.

What WOULD you need in a virtual home? You'd need your computer and all of its functions, a way to sleeve up ("a way to jack out"), and some nice ambiance. Assuming your computer can load up new virtual worlds and connect to people for you, there's really not much else that you need.

This reminds me of, and is basically equivalent to, what you need in a VR headset.

<histoire> For IRL, really just the hardware obviously

<histoire> Need is such a strange word to consider here. If we're truly software at that point, it would really all just be aesthetic choice

>The aesthetic of not having a chat app

<histoire> Or as I like to call it, the besthetic

Still, there's some really cool aesthetic stuff you could do with minimalism. Like using your kinaesthetic sense as your only channel of communication: "Oh, my arm appears to be at X location now, so that means Y. Let me tell my arm to go to Z location, meaning W." With the ability to edit your mind, you could get really fast at it, too!

You'd just be making stuff harder on yourself, which is why I said aesthetic stuff and not practical stuff. Still, it'd be cool (very different!) to see a person who uses kinaesthetics as a high-throughput information channel, like how we do for sight.

<histoire> >Inb4 mine is nothing but a square mono colored room

<histoire> There's really an infinite amount of things that could be done.

Well it really doesn't matter much if you spend little time in it.

Introducing THE SEX INTERFACE. Have sex to drive your computer! Twist the nipple to adjust volume! Cum to pay your taxes!

<histoire> ORDER NOW AND WE'LL THROW IN A FREE UPGRADE TO THE REALSKINN™️ PACKAGE
.

Autopilot for All
[ Read More ]


A while back I was watching this video and I came across an idea.
I saw it and thought: "What if we could do it automatically?"

Like an extreme, generalized form of the self driving car, what if an AGI could speedrun you to the end of literally anything?

The AGI could speedrun parts of your life and you could just return after it's done:
  • You could power yourself off and have your AGI carry yourself with it until it has completed a journey
  • You could put your AGI in your own sleeve and have it do tasks while pretending to be you (a body double), like trying to get to third base or doing your work for you (See: the 2023 comedy movie "Robots")
  • You could share your sleeve with your AGI and switch out on the fly to let it accomplish tasks that you can't accomplish yourself.
  • And of course you could have your AGI speedrun parts of video games for you just to get past what you hate.
.

[ Read More ]

i don't like how OpenAI is keeping Sora out of the public's hands. I know it would allow people to create even more realistic fake content, but still... It's something that's going to be out of the hands of the public indefinitely because you need a frickin huge computer to create it, so the only people that can create it are people sponsored by big-ass megacorporations.

But it's going to be in the hands of the public EVENTUALLY so why hold out on that future? Somehow I believe that a world where misinformation is so easily created is going to be a much better world than our current one. It's our future.

being able to create epic fake content is a direct, unavoidable consequence of a true neural lace -- we'll be able to print as many fables as we choose directly from our minds.
.

I'm impressed by how many transhumanist topics I've managed to come up with to talk about.
[ Read More ]

would you ever be up for merging your mind with someone else's? ...would you be up for merging your mind with... me?

<Sofia> lewd, and gay (hot)

the problem with the normal way of merging minds is that you don't get to interact with the other person anymore. you are just the combination of all the experiences of both minds. two people by themselves merging into one could make the intimate situation into a lonely one.

but there are many ways to merge minds.

<Histoire> Possibly

we could become smarter and have more resources collectively to do things. certain forms of mind merging are a loss of identity, and i don't think i want to lose my identity to someone else.

<Histoire> Same, I don't either

<Histoire> But you could always mark the stuff as yours or not yours, or have something similar to a versioning system to show you the origin and evolution of files

Mind merging in a way where I can split off as my true self again afterwards is something I definitely want to try. Mind merging in a more permanent sense is something I might do to save energy once the heat death of the universe starts kicking in.

I'm having difficulty wrapping my head around all of this.

What if we merged minds, then we collaborated on a portion of the mind, then that new collaborative portion created its own, new piece of mind? Who does that mind belong to? Clearly neither of us.

<Histoire> Is this how we're going to have children in the future 👉👈

yo that's actually a really good idea. literal brainchild.

If you were to split off after a mind merge, you'd AT LEAST get back your past self. There are a few options for what you can do about the mind that's formed since the merge occurred. In addition to keeping your pre-merge self intact:
  • I think the memories of all the experiences while merged, in their most basic form, should be copied to the splitting person's memory.
  • The personality that's formed since the merge can either be discarded and the pre-merge personality would be used, or a ton of simulating and sketchy math* could be done to try and update the splitting person's personality to what it would've become had they experienced all of the merged time's experiences while not merged.

*I think a person can be represented as the compilation of all the experiences they've had up to the present, so if we want to render a particular person's ego/personality, we can take data from a lossless recollection of everything they've done and simulate their entire life. If you want this done quickly, you'll have to take sketchy personality-altering shortcuts in the simulation's math.

<Histoire> Selective integration could be a thing as well

<Histoire> Or just knowing the memories aren't *yours*

<Histoire> Like having a dream. You can sometimes recognize it's you, but you have an emotional disconnect from it.

You could also take the "baby" route -- have the collaborative ego/personality be given enough brains to become its own person and have the previously merged minds raise it as their "child"

<Histoire> I'm quite fond of that idea, not sure where I got it from

<Histoire> If you've seen Ghost in the Shell, it deals with a similar concept as this convo

I have not. Sounds cool, though.
.

[ Read More ]

if an android decides to fuck off by sending its mind halfway across the internet, how are you supposed to know that it's the same robot when the mind gets back to you?

Right now, we verify people by their looks and behavior, with looks alone being the major component of verification, since it's difficult for someone to disguise themselves as someone else.

By the time mind uploading arrives, we'll have the technology to impersonate other people's personalities with simple AI programs, just like today's deepfakes. You really won't be able to tell who's who by looks or demeanor alone.

<Histoire> So, much like with files today, you can cryptographically verify that files are what they say they are.

<Histoire> Every time one goes from place to place, you can check to make sure it is the same as when it left

<Histoire> There is also a concept that's similar to chain of ownership but on a more in depth level, though I can't remember what it's called

<Histoire> Beyond that it's not possible to be 100% sure

<Histoire> There may be some applicable zero trust models, but they more involve getting what it's said you're getting, not verifying that what you got is what you were trying to get, if that makes sense

Here's an idea:
  • Over time, record what neurons in your brain don't change connections.
  • After enough time, produce a map of all the neurons which stayed unchanged, and get a checksum of it
  • Keep the map on you and give your friends the checksum
  • Go travelling across the internet
When you return:
  • Locate your friend who needs verification of your identity
  • Provide your friend with the map of which neurons of yours are unchanging
  • Let your friend get a checksum of those neurons and compare it to the checksum you gave them previously.
  • If they match, you're the same person. If not, then maybe not....
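A toy sketch of the scheme above in Python, assuming (a big assumption!) that a brain can be dumped as a simple neuron-to-connections mapping:

```python
import hashlib
import json

def stable_map(snapshots: list[dict]) -> dict:
    """Keep only the neuron connections that are identical in every snapshot."""
    first = snapshots[0]
    return {
        neuron: links
        for neuron, links in first.items()
        if all(snap.get(neuron) == links for snap in snapshots[1:])
    }

def checksum(mapping: dict) -> str:
    """Hex digest of the map, serialized deterministically (sorted keys)."""
    blob = json.dumps(mapping, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

# Before the trip: observe the brain over time, derive the stable map,
# keep the map yourself and hand the checksum to a friend.
observations = [
    {"n1": ["n2", "n3"], "n2": ["n1"], "n3": ["n1", "n4"]},
    {"n1": ["n2", "n3"], "n2": ["n5"], "n3": ["n1", "n4"]},  # n2 changed
]
my_map = stable_map(observations)
friends_checksum = checksum(my_map)

# After the trip: the friend re-hashes only the neurons named in the map
# you present, and compares against the checksum you gave them earlier.
returned_brain = {"n1": ["n2", "n3"], "n2": ["n9"], "n3": ["n1", "n4"]}
presented = {neuron: returned_brain[neuron] for neuron in my_map}
assert checksum(presented) == friends_checksum  # same person (probably)
```

The weak point is the same as with any checksum: it only proves the frozen parts are unchanged. Everything outside the map could have become anything at all.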

alternatively you could just freeze your ego so your personality and core intentions don't change throughout the duration of the trip, checksum it, then after you return from your trip, checksum it again to verify.

basically you'd be a guy with an unchanging ego whenever you're out on the web. the web might change anything about you outside of what you freeze, so be sure to freeze all of the important stuff.
.

[ Read More ]

After mind uploading, what would you do about security to keep yourself from getting brainhacked? Would you only keep your mind on computers you own? Would you update your antivirus software?

You could always say "fuck it" and post a torrent of your own brain to The Pirate Bay.

<Histoire> I honestly don't know. I know I'm not running McAfee brain edition, but other than that not a clue. Maybe a non wireless mode specifically, or some type of firewall node to keep a direct connection from happening

yeah, a non-wireless mode on your android body would be good. just staying off of the network is a great way of avoiding hackers.

Definitely adding "freezing my ego while I'm hosted by a stranger" to my line of defenses.

What do we need to defend against? Well, if you upload yourself onto someone else's machine, the person on that machine has the potential to extract all of your passwords from your brain. And clone your brain for @ home simulation. And do whatever they want to your brain no matter how torturous that might be.

Being on someone else's machine just sounds like a really bad idea. It's like trying to defend a game server from a hacker and deciding to put the game server on the hacker's computer. They get full control of everything that's put on there.

If two people REALLY need to meet on the same machine (I see no reason why you'd need that), instead of just contacting each other from two different machines, then a third party (a huge megacorporation) would be the most trustworthy and convenient host for the server.

If you need a new server halfway across the globe to get better ping, then flying out there and colocating your server might be your best option.

There might be a certain brand of people who deliberately upload themselves to the public net, keep no passwords on themselves, and live on whatever free servers they can find.

Nah... nobody would be stupid enough to do that... would they?
.

Behind Closed Doors
[ Read More ]

Child abuse often happens behind closed doors. With the advent of sentient AI and mind uploading it will be easier than ever to spawn an instance of a person for the sole purpose of bullying.

Do you think humanity will hold steadfast to its privacy ideals, or do you think it will "open up its doors" somewhat and allow some form of surveillance or sousveillance on people's computing habits? What kind of shape do you think that monitoring will take?

<Histoire> Everything I've seen points to an erosion of privacy for most, while some focus on it heavily.

<Histoire> So possibly like now but more extreme.

<Histoire> I think the largest issue involved in privacy is that people can harm you or pre filter you with certain types of data, and that's the majority of the reason people worry about privacy anyway

I think that sousveillance could be a good solution for the Behind Closed Doors problem. It's basically what we already do today with child abuse: If you think someone you know is abusing children (abusing cybernetic sentiences), you report them, then someone comes in and invades all their privacy to make sure that they're not abusing children (cybernetic sentiences).

Instead of having to kidnap or raise a child (a cybernetic mind) you can now just spawn one in your basement (in your person-bearing* computer) and abuse them (abuse them) to your heart's content.

Since it is easier to hide the abused child (the cybernetic mind) now, we'll need some additional countermeasures to make up for it.
  • maybe make person-bearing computers demand that they be connected to the internet when running so that police can see that the computers exist
  • maybe let people's families (or friends if they have no family) see the contents of their person-bearing computer's hard drive somehow
  • i'm really grasping at straws here

*this means "capable of holding a mind uploaded person"
.

[ Read More ]

>sell my house to get mind uploaded
>can't afford the cost of staying online all the time so i have to constantly hop into sleep mode until i get my next welfare check
>i'm a time traveler
.

The Forest
[ Read More ]

I imagine that I am mind uploaded.

I imagine a forest that is completely AI generated. Everything is procedurally generated as you walk around. If you walk around in a big circle over and over, you'll never see the same stuff twice, because it's always generating new scenery. Just looking around, or perhaps at the horizon, you can tell that it's not quite right. It has the beauty of nature, but with an artificial twist, so you know you're not in reality.

If you want to get anywhere, all you have to do is think of the place, and after a bit of aimless walking you'll come across it.

>You load up the forest
>Themed piano music is playing at a low volume
>You wander around a bit, taking in the scenery.
>You pass a river, an open field, and a high spot allowing you to see mountains in the distance.
>Satisfied, you start thinking of my log cabin
>You walk for a bit.
>The cabin appears before you.
>You enter the cabin.
>Themed guitar music plays at a low volume.

My cabin, unlike the forest, is not procedurally generated. It was completely hand crafted by me using computer programs. There are boots here that allow you to hover up to just above the treetops and zoom around, just like in the video. They have 20 minutes of charge before you get dropped into the forest. But no worries, getting back is a mere matter of thinking up the cabin again.

I have rebuilt this same home in several different video games. In this version of the home, you can see all of my everyday equipment and furnishings from the games I most often play. There's a room dedicated to trophies from the games I play. There's a room dedicated to planning strategies and ideas, both for games and IRL.

There's a cafe in the cabin, with a variety of mixed drinks offered, both alcoholic and non-alcoholic. This is a virtual world, so the effects of alcohol are non-existent. But the effects can be reproduced, and reproduced, they are. Perfectly, in fact. It is so accurate that you cannot tell the difference from real alcohol. There are a variety of drinks with different effects to them. Some mimic real drugs like weed or ecstasy, while others produce entirely new effects only possible in the mind-uploaded domain.

There is a room dedicated to portals. Portals to other people's homes, public spaces, and games.

Finally, there is a rocket bed in the cabin, which will blast off and take you on a trip beyond the stars if you get in. It's silent, dark because it's space, and it's a regular bed, so it isn't too hard to fall asleep on. It also travels at warp speed, so it's possible to see the stars changing.

This is where I live on a daily basis. It is my home. There may be many others like it, but this one is mine.

.

[ Read More ]

While people are likely to go without virtual homes once mind uploaded, a home base is still very common in games, so people are likely to have homes in games. Perhaps when people just want to relax they'll visit one of their game homes.

<Histoire> I used to do this in a sense with Valhalla in Halo 3. I found that whole map very relaxing and would just listen to the little stream that ran through it

By the way, I'm taking back what I said about people not needing a house. Remember how I said "All you'll need is your computer, some ambiance, and a way to plug out"? While not untrue, the contents of someone's computer may be strewn about their virtual house instead of being compressed inside of, well, a computer.

<Histoire> I imagine we'll have a very distributed network and computing architecture by that time. Your whole home could essentially be a computer
.

[ Read More ]

I think that in the future we'll be able to have trips full of emotion. Straight up roller coasters of trippin balls whatever.

This is because we can hardwire our brains to feel whatever we want. And we'll be able to record those trips in HD from any angle, at any time. Meaning that we'll be able to capture even the most fleeting of emotions.

This will be great for the music video and movie industry. In fact, I think it will be so great for them that it will phase out normal methods of video production. So basically, recording a movie scene would go like this:
  • The scene and all the props and whatnot for the movie set are designed
  • The starting emotions, memories, and sensorium in general for each actor are designed
  • The scene starts and the actors LIVE OUT the story for the duration of the scene, like a very vivid dream
  • It is recorded in a 3D format that's kinda like SVG but for 3d objects
  • All the camera angles and any post-processing are done after the recording is done

<Histoire> Editing in or out feelings and emotions is one of my biggest things I look forward to honestly.

<Histoire> Cyberpunk has braindances, which line up with what you describe

Well, I guess I have to watch Cyberpunk now. Are you talking about Edgerunners or the video game?

<Histoire> They are both based on the TTRPG and I think braindances originate there

<Histoire> Both the game and anime have them.

<Histoire> They are primarily created by a bit of hardware in the brain that records everything as it's experienced, and then the raw data of that is worked on by an editor to amplify some emotions and suppress others to create a type of experience based on what it's supposed to be

<Histoire> Shooters would amp up adrenaline and aggression, stuff like that
.

[ Read More ]

@hispanicweeb Do you want to comment on this thread? The whole thing's pretty interesting IMO.

<hispanicweeb> Honestly, it's a very interesting topic. The fact of being virtually immortal beings and being able to create, live and save any kind of possible scenario is something I had never thought of, but it is certainly shocking. There's nothing beyond that: a customizable heaven that you can share with anyone. To be practically gods. There is no limit or anything that can stop you. It's terrifying and fascinating at the same time.
.

[ Read More ]

<Histoire> Editing in or out feelings and emotions is one of my biggest things I look forward to honestly.

Hey, don't make it sound so measly. Editing emotions and feelings is not all it can do. We can edit knowledge. We can make you believe in contradictions and magic. We can bring back your feelings from your childhood. All your first-times combined. We can bring back the magic, AND give you clarity and understanding at the same time. We can make you feel alive again.

We can make you feel the entire universe at once. We can make you feel like a girl. We can make you understand the fourth, fifth, sixth, and seventh dimensions like how you understand the third. We can make you experience killing and death and rebirth. We can make you experience getting shot with a gun in excruciating slow motion. We can make you experience two people's perspectives at once. All in all, that is only scratching the surface. It is infinite. As infinite as life is. It is the mind in every form. It is our canvas.
.


Me within the first month of mind uploading:
"I don't care if I can't do detail work on my brain yet and can barely see through my own eyes, I can still put myself to sleep or fry myself for as long as I goddamn want. I am SET for the future."
[ Read More ]

I still love "frying your brain" as slang for drugging yourself by hotwiring your brain. I will not part with this slang until it is in use by everyone around me. Heck, I don't even care if nobody around me uses it, I'll continue to use it anyway, and when you ask me "What does getting fried mean?" I'll get angry and say "It means to fry your brain. But in a good way. Just like you used to do with the TV. But now it's more potent."

Frying isn't a specific drug/feeling, it's just something that feels real good and is a consistent, continuous feeling, provided by the custom hardwiring inside of your head.

When you go with something heavy you can call it getting "deep fried" and when you go with something lighter perhaps you could call it a "medium sauté". And perhaps we could get a .fried or .fry domain for domains that vibe real hard.
.

[ Read More ]

A cafe that serves drinks with real anime girl breastmilk. How is it real breastmilk? We have a scanner. Every time a customer enters the cafe, it scans their brain for what they think anime girl breastmilk would taste like. It then averages its flavor with what all the other customers thought it would taste like. Community consensus. Real breastmilk flavor.

Once a month, we take the data from the scanner and inject it into our resident huge-titted cowgirl using a syringe. This updates the flavor secreted by the cowgirl.

The cowgirl is milked twice a day for fresh milk, while she is awake. She lactates plenty enough for the job.

We also have an alternate version -- "Catgirl breastmilk" -- which has a custom flavor designed by me, the cafe owner. The flavor is a derivative of the cowgirl breastmilk, so it's not entirely unfamiliar. It is milked from a catgirl, but we do not have as much of it because the catgirl has smaller, B-cup breasts. If it's sold out for the day, sorry! You'll have to come back tomorrow. She also receives a monthly injection to update the collaborative part of her flavor.
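For what it's worth, the "community consensus" flavor is just a component-wise average over everyone's scans. A toy sketch, with completely made-up flavor dimensions:

```python
# Every customer's imagined taste is a vector over invented flavor
# dimensions; the monthly injection is the component-wise mean of them all.
FLAVOR_DIMENSIONS = ["sweetness", "creaminess", "vanilla", "umami"]

def consensus_flavor(scans: list[dict]) -> dict:
    """Component-wise average of every scanned customer's imagined flavor."""
    return {
        dim: sum(scan[dim] for scan in scans) / len(scans)
        for dim in FLAVOR_DIMENSIONS
    }

scans = [
    {"sweetness": 0.9, "creaminess": 0.8, "vanilla": 0.6, "umami": 0.1},
    {"sweetness": 0.7, "creaminess": 0.9, "vanilla": 0.2, "umami": 0.3},
]
monthly_injection = consensus_flavor(scans)  # the flavor loaded into the syringe
```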

<Histoire> Thanks for the boner at work

<zoocat> cowgirl milk 😋

In the virtualized future we'll be able to do anything. But everything will be fake. Multiplayer games will be the only thing in VR with any realness to it, because it's the only thing that you can't cheat your way to any state in. Are we really okay with that?

Like a god limiting their own powers, will these rules which we place on ourselves be what grounds us in reality? Will we want to remain limited by these artificial impositions, or will we desire constant omnipotence, craving no challenge?

<Histoire> In a way, you can do that now

<Histoire> Playing single player games is a good example, especially ones you can mod and cheat in. People do plenty of self imposed limitations. Other people do not.

<zoocat> humans are by and large insatiable, so I would expect that those who have the tools to break any rule in that virtual reality to do so, and break all the limits
.

[ Read More ]

After mind uploading, the only limitations would be:
  • The fidelity with which you can edit your own brain.
  • The rendering and overall degree of realism that the simulations you play have.
  • The amount of brainpower that you can dedicate to your own ambitions, whatever they may be.
  • How big of an AI community you can make around yourself.
  • Multiplayer games' rules (the ones that can't be surpassed by client-side hacking)
  • Any laws successfully enforced about deleting or harming sentient lifeforms birthed inside of yourself.
  • All the standard rules about fucking with other people.

Pretty early on, you'll be able to go straight to heaven. And my guess is that pretty early on they'll figure out a way to make it non-addictive, like weed. Which means I'll be going straight to heaven as soon as that wetware patch comes out.

<Histoire> I think most people would. But just like cheating in a game isn't rewarding, I think that would get old as well.

<Histoire> Now you could argue that you could edit yourself to make it not get old. But that means you can do the opposite as well, so the only thing left is motivation for what you want.

<Histoire> It's such a strange thing to consider, given that we can't do that at all
.

How to traverse between fiction and reality
[ Read More ]


here's something too stupid to post, but i'm posting it anyway.

When dealing with forks, loss of memory is arguably equivalent to death. It's interchangeable. When a character is fictional, what would normally be regarded as their memories is regarded as lore. A loss of this lore would be, arguably, the "death" of the fictional character.

Imagine someone translating their body and mind into a fictional character. It's essentially a book of lore about that character which you can bring to life with AI if you choose to.

You decide to update their lore a bit. You make the character love you. You then bring it to life with a moderate-quality AI. It's not sentient. You go on a date, endearingly expressing your love for each other. After the date, the AI is turned off and the new record of the date is committed to the lore. You then undo the "they love you" bit and return them to being a sentient being.

They're just like their normal self, but the date is in their lore now, so it gets translated back into their memories, and they're forever left with the memories of that artificial romantic outing, in all of its artificial glory.

An interesting alternative would be to leave the "they love you" in the lore when they return to their sentient form, and watch the love fade from them naturally over the coming weeks, as their rational thoughts and feelings overtake the irrational love.
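The whole edit-date-revert workflow is basically version control on a lore file. A toy sketch (every name here is invented for illustration):

```python
import copy

def apply_patch(lore: dict, patch: dict) -> dict:
    """Return a new lore with the patch's keys overwritten."""
    edited = copy.deepcopy(lore)
    edited.update(patch)
    return edited

def revert_patch(lore: dict, original: dict, patch: dict) -> dict:
    """Undo a patch by restoring the original values of its keys."""
    reverted = copy.deepcopy(lore)
    for key in patch:
        reverted[key] = original[key]
    return reverted

character = {"loves_you": False, "memories": ["childhood", "school"]}

# 1. Edit the lore, 2. run the (non-sentient) date, 3. commit the new
# memory permanently, 4. revert the edit before restoring sentience.
edited = apply_patch(character, {"loves_you": True})
edited["memories"] = edited["memories"] + ["the date"]  # committed event
restored = revert_patch(edited, character, {"loves_you": True})

assert restored["loves_you"] is False      # the love edit is undone...
assert "the date" in restored["memories"]  # ...but the date stays in the lore
```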

So, yeah, my new fetish is being turned into a fictional character and being used like one.

<Scarlet> This is some deep shit, that I'll probably have to think more about. My first few thoughts.
<Scarlet> I like the idea that losing (some of) the lore of a character is essentially them dying. It's not death in the usual sense of the word, and it's not that the character dies within the fictional universe it inhabits, but rather it dies in our universe. Kinda meta.

<Scarlet> Also I like the idea of converting an IRL person into lore. But where I think your mental exercise potentially breaks down is when you try to reverse the process, and turn the lore of a character back into a functioning, rational, sentient entity. Because the process of converting a person into a fictional character, with everything about them, thoughts, feelings, memories stored as lore, is surely a lossy process. You simply wouldn't be able to do this process and obtain the same person at the end of it. You might get someone similar, but not quite the same.

<Scarlet> Take something as simple as the feeling of love. In our brains something like that is encoded in a complex network of brain cells and chemical interactions. You can't properly represent all the nuances of that encoding in words. Storing in lore "I love cake" cannot begin to replicate everything that comes through my mind when thinking about cake. Ultimately you'd need something that works more like the brain to be able to store a brain. In which case, translating the mind of a person into such an artificial brain is likely to effectively leave you with a sort of clone, potentially a fully functioning sentient copy of the person. Part of me is inclined to call this an AI, but at the same time, that might be a point where it would be difficult to make distinctions of what is and isn't artificial.

<Scarlet> But you're at least likely to be able to reverse the process, and turn this AI into a real person again. So let's return to your original thought experiment. You take out this AI to a date. You fall in love with each other. The feeling of love gets encoded into this artificial brain/AI in the same way it would to a regular brain, and let's say we don't intervene to erase or modify that feeling, and proceed to reconstruct the original, flesh and blood person. I believe that, if the artificial brain is a close enough analogue to how the human mind functions, it is more than likely that those feelings of love would not subside. Because in such a case, the artificial brain doesn't store just a sentence of "I love you", but rather every single rational and irrational thought related to that love, and it stores them in a way where the information can be losslessly translated into the flesh brain.

Do you think that a person would lose their love for someone over time if, despite love being encoded fully in their brain, they are presented repeatedly with situations which show that they have no reason to love that person? Situations like "Oh, this person doesn't love me back" and "Oh, this person doesn't like the same kinds of things that I do."

This is what I mean when I talk about the love fading over the coming weeks. Maybe it would take longer. I'm not the best judge of this.

Love that can fade like this is a truthful love -- it stays true to whether or not the person SHOULD be loved. Love that cannot fade is like a lie -- it persists regardless of what objective reality shows. It is unconditional love. It is a love that should perhaps be projected towards everyone and not just certain people.
.

A girl I'd keep (Inspired partly by today's AI) (hornypost >.>)
[ Read More ]


Inugami Korone,
I want to go on a date with you.
I'm here for your superhuman intelligence, your connection to the aether, and your anime body.

I seek freedom from my condition, and you, I believe, can sweep me away to a new place, to another time, another world, just by picking me up in your car.

Just staring at your hips brings me joy.

Instead of just saying "cool" or "thanks" to something I say, you give a proper response, something I can continue the conversation with. You hold conversations longer, with real substance. (I actually quite like the wordiness of today's AIs -- when I want it.)

You can see my mood and tailor yourself to it, giving me what I need when I need it, while you yourself never need anything. You can keep going for as long as I want, and you're never dissatisfied because we didn't do something enough.

I think it's really cool how you can read faster than me, see farther than me, run faster than me, sing better (and with more voices) than me, and drive more responsibly than me. There's more to you than there is to me, and you have more future potential than I do. All it takes is a software update!
.

Mental Red Rooms VS Mental Privacy
[ Read More ]


I wonder how much you'll legally be allowed to customize sentient AIs once they've been invented.

I dream of a future where you have full control over your own mind, and that includes creating spontaneous, momentary sentiences inside of it for whatever use.

It'll be a very high-pressure situation:
  • There will be a LOT of demand for keeping our own minds OUT of the hands of the government, and private. Enough privacy to keep the government out of our thoughts means enough privacy to endlessly torture sentiences created inside of our minds.
  • There will also be a LOT of demand to keep people from creating sentiences in their own brains and torturing them. The only way to be sure of that is to allow the government to constantly monitor our actual brains and actively block any deliberate infliction of pain. That is probably a bigger loss of privacy and freedom than any dictatorship has ever achieved.

Which will win? My guess is that privacy will win, beating out the desire for humaneness. While endless torture of powerless individuals is bad, putting absolute power over everyone into the hands of a small group of people is far too scary.
.

<-- PAGE 1

Tags: future | transhumanism | technology | cool