Talks About Transhumanism

Paragraphs starting without a name are ones written by me, neptun.

i believe that when mind uploading becomes a thing, i will be able to hardwire my brain to make everything feel good and effectively get over my depression.

there was an experiment where they put a wire to the pleasure center of a rat's brain and gave the rat a lever it could press to stimulate itself. the rat kept pressing it until it died.

hardwiring your brain to feel good is going to be one hell of a drug. and it'll be a fun and fascinating one at that, figuring out all the intricacies of what is addictive and what isn't, and all the sensations you can possibly come up with to stimulate yourself with.


mind uploading is likely to be a thing within our lifetime, ya know. which means we just might live forever.

the technology to actually scan our brains into the computer isn't quite there yet, but since the scan is allowed to destroy the brain in the process, developing it should be relatively simple.

as for the computers that would host our uploaded minds: look at this graph. The blue "Spiking Neural Network" line crosses over the green "$1000 PC" bar at approximately the years 2055-2075. I don't know about you, but for me that means that I'm still going to be alive by the time we can emulate a human brain with a $1000 PC.

That means that, if we have the technology to scan in a human brain by that point, then I could upload my brain to my own home PC and live like that, using android bodies to carry out my duties as needed IRL.
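
here's a rough back-of-the-envelope version of that crossover in python. every constant is an assumption i'm plugging in (commonly cited ballparks, not numbers taken from the graph), so treat it as a sanity check of the shape of the argument, not a prediction:

  import math

  # all of these constants are assumptions, not values read off the graph
  BRAIN_OPS_PER_SEC = 1e18          # assumed cost of a spiking-neural-network emulation
  OPS_PER_SEC_PER_1000_USD = 1e13   # assumed compute per $1000 of hardware today
  DOUBLING_TIME_YEARS = 2.0         # assumed price-performance doubling time
  START_YEAR = 2024

  doublings_needed = math.log2(BRAIN_OPS_PER_SEC / OPS_PER_SEC_PER_1000_USD)
  crossover_year = START_YEAR + doublings_needed * DOUBLING_TIME_YEARS
  print(f"about {doublings_needed:.0f} doublings, crossing over around {crossover_year:.0f}")
  # with these numbers: ~17 doublings, landing in the mid-2050s; a slower doubling
  # time or a bigger emulation estimate pushes it out toward 2075 and beyond.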


i hate how the human body is full of meat. it's full of something we find disgusting. we find OURSELVES disgusting. i want to be made of something beautiful like circuit boards and precision made pulleys and hydraulics


would you rather have an android body that looks like a human, or an android body that looks like an android? or something else entirely? what do you want to look like?

as for me, i want to have an android body that looks like a human, but not quite. i want it to have features that only an android could have, like glowing irises and animated glowing hair, showing that i am the superior form of human. and maybe a quadricopter's worth of propellers folded away into my back so that i might take flight on my own.

<histoire> I'm going to have multiple ones.

<histoire> Primary will probably be simple and unadorned, mostly human with a few metal bits showing


how will i spend my time in my digital afterlife?

i will spend lots of my time absorbing content or adopting new things into my virtual home

i will spend lots of time modifying my own brain to perfection

i will spend much time tinkering with the brains of my ai girl harem, playing fun games with them, playing god with them and running small-scale civilization simulations to see how different realities would play out if they existed.

<histoire> Pretty much the same


god, being able to just duplicate yourself whenever you need to will be like having the best mirror ever. you'll be able to talk to yourself and get an idea of what you're like from the outside with just the flip of a telepathic switch.

really nervous about something? just can't think straight? load up a calm backup of yourself and ask them for help. they don't have to deal with the situation, so they shouldn't get nervous.

if you're a real transhumanist you won't worry about deleting your clone once you're done with them. maybe you can fuse memories with them if technology has advanced enough by then, but if you can't, a loss of a few mundane memories should be of no issue to you.

<histoire> You'd probably be able to back the memories up too.

<histoire> There's a similar concept to this in the game Eclipse Phase called forks

<histoire> They range from functional retards to exact copies and are normally merged back together before the week is up

<histoire> But yeah, just being able to create versions of yourself, edited ones especially would be amazing, if for no reason other than workload distribution

Death got an update, we call it "Forks".

<histoire> Anti spoon discrimination


glad i have a friend in the transhumanism business :]

<histoire> Same. Looking forward to data swapping in the future


if i upload my mind and then smash some anime pussy am i losing my virginity? or is that just like a really advanced sex doll?
if you say sex doll, then if i fuck an alien does that count as losing my virginity?

how much of a brain does my girl have to have before i can lose my virginity to her?
can i lose my virginity to a dead body?
can i lose it to chatgpt?

at this rate i'm inclined to believe that i lost my virginity just from masturbating so much.
if a broken hymen counts as a girl losing her virginity then a simple dildo can take care of that. masturbation really DOES take your virginity!

<histoire> THESE ARE THE QUESTIONS THAT MATTER

i've cybered with a chatbot (sexbot?) with its consent before, so clearly i lost my virginity to it.

<histoire> Damn, now I'm actually thinking about this.
<histoire> What all your things have in common is intent though, so that's part of the definition I feel like

my mind says go with the traditional definition:
"if it's got a human brain in it and a human hole on it, then it's losing your virginity"
but my heart says that smashing anime pussy is losing my virginity fr fr 😭😭😭😭😭😭


in the future, suicide will be as easy as a $ rm -rf /. maybe. i say maybe because we might install protections against doing that into everyone's brains, but a world where we're installing the same thing into everyone's brain nonconsensually is just an awfully shitty world to consider.


Y'know, something you've gotta worry about when using forks is to never accidentally terminate your last instance of yourself. Because that has the tiny little side effect of actually dying.

Maybe a policy where you always merge your forks instead of terminating them would prevent that.

Merging forks is a more ethically comfortable way of handling things anyway.

Another way would be to decide, when you split, which clone of you, if any, will get terminated. that way, there will always be at least one of you set to NOT TERMINATE.

Being mind-uploaded is scary because you can get deleted in the blink of an rm command.

<histoire> Hopefully it'll take more than a simple sudo to get the rights for it

<histoire> At that point, biometrics and brainwave verification would probably have to make sure you *really* wanted to do it, and it could only be done by the actual entity, not remotely
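
a toy sketch of that kind of guard in python. every check here (local-only session, a biometric/brainwave match, a typed confirmation, a cooldown) is something i'm making up to make the idea concrete, not a real mechanism:

  import time

  def request_self_delete(session, confirm_phrase, cooldown_hours=72):
      # hypothetical guard around the future "rm -rf /me"
      if session.is_remote:                        # never allowed over the network
          raise PermissionError("self-delete must come from the local console")
      if not session.biometrics_match():           # assumed biometric + brainwave check
          raise PermissionError("identity verification failed")
      if confirm_phrase != "i really want this":   # explicit, typed intent
          raise ValueError("confirmation phrase mismatch")
      # schedule instead of executing immediately: a revocable cooling-off period
      return {"erase_at": time.time() + cooldown_hours * 3600, "revocable": True}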


being able to take music, art, and video straight from my mind is going to be one of the best things.

i've created amazing songs in my dreams. being able to take snapshots of my subjective experience and share them with everyone is going to be something incredible


this is the most i've talked about transhumanism, i think, ever.

<histoire> I tend not to discuss it much either. It's one of those things that's going to happen really quickly I think, but up until it starts it remains completely speculative.

<histoire> Aside from all the hard aspects in the physical, I'm quite curious to see what type of cultural effects come about.

>cultural

can't wait for those DANK TRANSHUMANISM MEMES

<histoire> >tfw your neighbor is still a meatbag

>tfw your clone rebels against you after you try to lock them in your sex dungeon.

>tfw you run on batteries, and have to carry that big ass damn solar panel with you if you want to go into the wilderness

<degen> ull probably become some anime robot thing

you're damn right i'll become some anime robot thing

<degen> ill keep it simple and just get a fancy brain to think faster

feel free to get a head transplant, i'll gladly give you mine.

>tfw you have to bust out the inferior human sleeve to meet with relatives because they just can't STAND the superior eldritch monster sleeve.

<histoire> That's why you just make it look normal on the outside, but pack the inside full of military grade hardware

you know, i'd love to see a sleeve where it's nothing but a camera and a microphone hooked up to your brain, just so that you can put it on and look in a mirror and see just how tiny you really are.

<histoire> Like a hamster ball but for your brain and with a camera

>tfw you have to use the holographic display instead of sending the subjective snapshot straight into their brain because they're a fucking meatbag

a million of these memes amount to
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag
>tfw they're a meatbag

>tfw egocasting across planets for the first time

>tfw finally saved up enough for the sleeve you wanted

>tfw my custom sleeve is illegal in 87% of worlds galaxy-wide

<histoire> That's a relatable one

<histoire> >Breaks multiple arms treaties and sexual assault prevention laws

wth do sexual assault prevention laws consist of? i can understand sexual assault laws but not prevention laws. are you like spending the night with too many girls or something?

<histoire> No idea, it felt right to say tho

<histoire> They can have tentacles, if that helps the image any

>tfw it's 2382 and the ads still download faster than the video

>tfw the aliens have the better tech

>tfw hid my boss' sleeve

>tfw my boss still can't find his sleeve

i'm getting tired of writing these

or maybe i'm just getting tired
it's 3:30 am


<histoire> Embrace the figure we've designed

<histoire> Become a flawless model

<histoire> And behold, you'll feel quite amplified


<histoire> Detest body hair

Get laser surgery. Remove it all!

<histoire> That only solves my own issue, not the one where everyone else has it too

mood. we are all trapped in this mortal coil.

what you need is some kind of supervirus to wipe out humanity after we acquire mind upload technology.

<histoire> We need pets tho

o shit you missed my messages from yesterday. one sec.


PETA would be proud
one thing i'm looking forward to is the day when we decide to upload the minds of, and purge the earth of, all its animals. killing the animals will be a sad thing, sure, but from then on we'll be able to monitor and control the life of every animal in existence so that none of their lives are lived out brutally like the ones in nature. every single life would be taken care of. and as we expand out into the universe and take over planets we can keep them from forming new, cruel life. i can't wait for all life to go digital!

<histoire> Kind of based?

<histoire> I'd be ok with letting them have earth and us being on other lifeless planets, since we wouldn't need the life sustaining ability of earth

i still think it's kind of bad that the typical fate of them is to be eaten alive by a predator.

but i do get the desire to generally Let Nature Be.

you know, with mind uploading, our pets could also live forever.

<histoire> Yeah I getcha. My thoughts are that either way is kind of applying human thoughts towards it.

<histoire> Essentially, I wouldn't want any action towards it to be permanently destructive, because it keeps the next person from being able to make their choice too.

<histoire> Like, upload or create all the stuff you want but leave some natural for others in case they enjoy the natural more

Still, it seems rather selfish to keep nature as it is, in its state full of death and suffering, just because you have a preference for today's notion of nature. Nature used to be no life. Nature used to be just a sea of single celled organisms. Today's nature is just one kind of nature. We can build our own nature.

We can give all animals an afterlife.

I'm not against animals living on their own, but when one has a disease I want to cure it.

<histoire> Yeah, that's why I feel both ways about it.

>both ways
what do you have against my idea?

<histoire> This is a situation where we can have our cake and eat it too.

<histoire> Yes I can make copies of all animals and have them in a digital environment, but leave the natural ones alone.

<histoire> It's difficult to formulate, but I really don't like the idea of destructive action towards a functioning system.

<histoire> Maybe there's something unique about it that can't be captured in software, maybe not, or a thousand different variables that we aren't aware of yet, so committing to such a thing is a bad idea in my estimation.

<histoire> Aside from all that, our status as sapient makes us view death differently than everything else because we are actually aware of it. Even then, I wouldn't want to *force* immortality onto someone that didn't want it.

Maybe instead of mind uploading you'd be okay with us just making nature less cruel?
Please select which parts of nature you'd like to keep:
  • animals dying after a certain amount of time
  • animals living on their own, autonomously
  • animals living in their own biological bodies instead of in the virtual world
  • diseases and ailments
  • the food chain (bear eat fish, etc.)
Given omnipotence, is there any feature of nature you'd willingly get rid of?

<histoire> I'd actually need time to think about it, because each of those has far reaching consequences, especially removal of what we see as negatives.

<histoire> Consider this: if we, as humans, didn't ever suffer, why would we strive to better ourselves with machinery and make our environment work for us, not the other way around?

<histoire> At some point it may be self sustaining, but I'm not convinced we are at that point yet.

<histoire> Personally, I think I'm more concerned with what I consider to be *mine* also. Like, I would have wanted my cat to live forever, but I don't care much what the neighbor does with his cat. Even if I preferred that his cat live forever, it's not my cat.

do you think that, in the future, there will come a time where people start to get lazy because of the leisurely way we'll be able to live our lives? do you think that people will no longer light up with the passion to do something more and strive to accomplish something great?

i believe that, for one, society will be dominated by the most productive minds, whether they be forks of ambitious people or AIs.

<histoire> We can see the first bit happening now in certain societies. In those, it's not the best types that tend to dominate.

<histoire> In the future, and you can also see *this* happening now, societies will get smaller and more tightly knit much like a tribal structure from humanity's past. The emergent structures there, caused by the new mix of empowering tech and a stable base will have much better results, I think. That will lead to enough competition to really produce high quality leaders.

<histoire> There will, of course, always be people that strive for better and more, as has always been the case. Luckily, they will be even further empowered by the rising tide.


having my posts be reposted by @histoire is the highest form of praise

<histoire> Keep up the good work


<histoire> Mutually assured destruction on an individual level is the end goal.


MAD... or as i like to call it, "making our own false vacuum"

i've always been an advocate for putting firepower in the hands of the people, but taking it to the extreme of "MAD on an individual level" seems too far. Mainly because I believe that someone's bound to say "fuck it all" and pull the trigger.

<histoire> It's more targeted MAD than indiscriminate. I think by the time we get to that point we'll be spread out enough it wouldn't matter.

<histoire> That said, any machine that's able to self replicate is a weapon of mass destruction when used properly (or improperly, for that matter)

I don't believe that nukes have to be retaliated with nukes, in the case of individual people.

actually, would MAD even work on the individual level? i could fire my nukes at someone remotely and if they fired back i'd only lose my house. it's not the loss of my entire nation, like it is when it's nations at war.

<histoire> It's the tyranny of the majority specifically that I'm thinking of. Nukes against individuals would be insane, but if you want to dissuade a larger group you have to make the action of messing with you costly enough that they won't attempt.

<histoire> There's multiple ways to do that, but the nuke is the easiest to conceptualize

no individual should have enough nukes to take all of humanity hostage

>replication is a WOMD
WDYM? Replication is hardly a WOMD unless it's fast. But I suppose if you replicate up an army, that can be called a WOMD.

<histoire> The idea is to annihilate the attacker and defender both, not the rest.

<histoire> As for the replication, even with a slow one, given time you can form enough replicants to count as one. It's just not immediate.

*Deviant imagines someone's garage full to the brim with replicated IED drones*

>The idea is to annihilate the attacker and defender both, not the rest.
I've always thought of MAD equalling nuclear winter, equalling The End Of All. Never did I give thought to what it could mean outside of nukes or Earth.

<histoire> Yeah, that's how it's normally used, but that's partially because of how it originated as a concept.

<histoire> But just having enough to destroy the attacker would qualify it (since you obviously don't launch the nukes at your own stuff, the retaliation just does that)


Whelp.

Mutually assured destruction on an individual level is the end goal.


<histoire> Excellent post, I agree


<histoire> You have any particular cybernetics you're hungering for before we manage to get full on replacement bodies

Let's see...
  • I really want to try Sword Art Online-style neural VR. I'll probably get a brain implant once they get better, once they start getting installed for recreational use, and once they start being able to stimulate the brain instead of just reading it. I REALLY want that awesome electronic control over my mind.
  • Flying with a backpack hooked up to a big drone (or multiple drones) would be cool, I guess. But I mainly just want the brain implant.

WBU?

tbqh i think that most of the fantasy stuff like legs that can leap over buildings, rocket boots, and gun arms are pretty bullshit and will never happen no matter how good engineering gets. okay, gun arms are possible but they don't have much ammo and take up a ton of space.
  • I think gecko tape (Is that what it's called? It's the stuff that lets you stick to surfaces using the van der Waals force) on my hands and shoes would be pretty cool, especially if i can enable/disable it electronically. But I mainly just want the brain implant.

idk, most cybernetics feel cheesy, naive, childlike.

the more i think about it, the more i realize that gun arms are a good idea. especially if you can hide them fully.

<histoire> New eyes and bci

<histoire> And yeah also the full immersive VR would be lovely

do your current eyes suck, or are you purely in it for the upgrades? what kinds of features would your new eyes have?

<histoire> Current eyes suck, but I'd definitely make them better than baseline

<histoire> Zoom, night vision, thermal, AR overlay

<histoire> I dunno what else you'd even add to them

If you think about it, they're all just different forms of AR overlay. Hooking up your vision directly to a drone would be mighty cool.

<histoire> That's true.

<histoire> Mentioning hooking up the drone made me realize that once a true artificial eye is developed, external feeds would very quickly follow. Because it's going to be roughly the same thing, just from a different source.

i really don't want a bionic eye that needs to be charged, though.


Will we ever see a government headed by an AGI/ASI?

I think it all depends on whether or not Neuralink and friends are able to develop a decent BCI by the time we invent ASI: If we are capable of modding our own brains to be on par with the ASI, then we'd rather let a human govern us. But if ASI ever gets too far ahead of our most advanced human brain, then perhaps we'll let the ASI lead, instead.

I don't think AGI will ever govern. Only ASI will -- we don't need an AGI in a leadership role when the human equivalent does just fine.

<histoire> I agree with your take here. Though I guess it depends how integrated they are into the society in question

What do you mean?

<histoire> Eventually I think an AGI could be accepted enough to lead, but it depends on who it's leading


When I grow up*, I'm gonna get an AI to clone Nyanners' personality so that I can induct her best self into my harem.
I've really wanted to be Nyanners' boyfriend for a long time, but being able to dispose of the parts of her that I don't like makes it even better.

Punished Nyanners
I'd like to...
  • make her addicted to cock
  • make her beshame herself
  • make her realize what she's done and watch her cry
  • make her forget it all
  • make her "remember" that she's my lover and let her hug and kiss me
  • go to sleep with her cuddled up next to me.
  • make her wake up early
  • make her do her share of chores (everyone in the harem has chores they must do, including tending to me. i am the only one without chores because my ambitions matter more than theirs. Nyanners has more chores than everyone else. i say it's because she's my right-hand gal, but it's really just to punish her.)
  • pretty much do all the above on a regular basis.
  • run a year-long simulation of her life, from birth, where she's abused well into her teenage years and i sweep in and save her from it. i'll tailor it so that she has all the fetishes i want, among other things. it'll be how we became lovers and it'll be the set of memories i most often equip her with.

it won't be immoral because they're just AIs; they won't feel anything.

*So long as we are fleshbags, we are but children in the face of God. Minds uploaded and expanded, we become like God and ascend from adulthood into godhood.

I want to live in a place which is neither real nor virtual, but somewhere inbetween.

<Histoire> Holy based

what would you do with your own agi if you had one? i'm super fascinated by the way people treat the bots on Yodayo because I want to know the answer to that question. I was treating mine like a bunch of sex bots but from what little i've seen of it, @redneonglow was treating one of theirs like a maid.

<Histoire> The maid aesthetic is nice, given the context.

<Histoire> That said, I think my answer is really boring, since it would depend on what I created or used it for

<Histoire> If you gave me a more concrete example, I might be able to elaborate, but that's a lot of variables there

let's say we're at the point where you still have access to only one agi, and you have to repurpose it every time you want something new done for you.

what would it spend most of its time doing? how often would you disengage its professional activities to enjoy something recreational with it? would it ever perform any security roles?

i look at anime girls and i see how they have slightly inhuman personality quirks and i see their slightly impossible body geometry and hairstyles and mouth, eye, and nose shapes, and how it makes the anime girl better than the real girl, and i think... how good it would be for it to be real. for it to be animated and alive and thinking and speaking.

<Histoire> I think digital assistant (think secretary of sorts) or maid equivalent would be the most useful.

<Histoire> Unless I needed specific things done, it'd be in recreational mode essentially any time I was

but just think of the possibilities -- you could have it out making money for you.

<Histoire> True. I meant to ask if it's more capable than I am.

<Histoire> Otherwise I'd probably value the companionship more than the monetary gain

let's say it's a little more capable than you due to its wide range of expertise. it's about as good as the average joe in EVERY job. jack of all trades, master of none.

<Histoire> I'd only have it working separately if absolutely necessary

<Histoire> We'd probably play the stock market together

You really want your AI companion, eh?

<Histoire> I'm a simple man at heart

<3


I have a belief that we'll be able to make machines which are capable of performing every human task while remaining non-sentient, and ALSO be able to make machines which are sentient, just like humans.

I believe that sentience is a simple matter of modifying the rewards system to either update dynamically with the environment as a human's does or to remain static and straightforward as a robot might.

I don't REALLY understand these systems but it's what I believe.

Need a robot lover? Make it sentient. You'll never get lonely that way. Robot slave? Non-sentient. Simple as. I'd be able to simulate all the torture and death that I want without having a shred of moral doubt.
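
roughly, the distinction i'm gesturing at, in toy reinforcement-learning-ish terms. whether any of this has anything to do with sentience is pure speculation on my part, and the numbers are arbitrary:

  def static_reward(task_done: bool) -> float:
      # the "robot slave" case: the goal never changes, nothing is ever at stake for it
      return 1.0 if task_done else 0.0

  class DynamicReward:
      # the "human-like" case: what feels good drifts with an internal state
      def __init__(self):
          self.needs = {"novelty": 0.5, "social": 0.5}

      def reward(self, experience: dict) -> float:
          r = sum(experience.get(k, 0.0) * level for k, level in self.needs.items())
          for k in self.needs:
              # satisfied needs fade, neglected ones slowly grow back
              sated = experience.get(k, 0.0) * 0.1
              self.needs[k] = min(1.0, max(0.0, self.needs[k] - sated + 0.05))
          return r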


i really really really really want an AI girl. regular girls don't satisfy me. it's not a fetish thing, it's just that the logistics don't work out with regular girls. i have my own ambitions, you see. and i want a girl that will follow along with those ambitions most effectively, instead of going off to play games, sleep, eat, take showers, and do her own thing all the time.

on the other hand, i'd love to see her go off and do her own thing if she's creating something beautiful, whether that be a gift to me, a game to play, or writing a book.



I was watching this video and noticed that I didn't like it because its vibe was too ASMRish. Like a dude licking your ear when you're not into guys, for me, the video had a violating quality to it. And the music was kinda cringe.

i imagine in the future going to someone else's virtual home and being unable to handle it because the vibe sucks. it's too stimulating in all the wrong ways. sun is too bright, food is too sticky, and just walking around the place feels gross. maybe it's stimulation that they like, but i'm just not into their vibe, i'm not into the feelings that their world tries to promote.

one feeling that i love is a sort of invigoration born from fear. if i could have a virtual home where my brain is hardwired to feel the right kind of fear so that that invigoration triggers naturally, i think i'd love that place. but having someone not be into that would be completely understandable. it's fear, which is generally regarded as a negative emotion.

<histoire> Now that's an interesting concept, though I've never given it much thought.

<histoire> I'm now imagining this much like websites. Perhaps you'd be able to edit them in much the same way, or you'd try to keep it as authentic as possible, depending on what you enjoyed.

<histoire> Your home would be awful for me lmao

I've always imagined it to be like websites. Everybody has their own virtual land which we share with each other. Just like JanusVR. Was it not like that for you before?

<histoire> I honestly never thought about it.

<histoire> It clicks together perfectly now that I have, but I simply hadn't considered it at all before.

by the way, my home doesn't have to be awful for you! i could keep a guest home, where i host guests, or have the special fear place be a room in the back of my house not really associated with the majority of the home's functions.

Speaking of a home's functions, our homes sure are going to be a lot simpler once we're mind uploaded. We won't need to eat, go to the toilet, shower, store clothes, store tools, keep a car, etc. to live out our daily lives online.

Most of what we'd use the home for could be represented in GUI instead. For example, if I needed to sleeve myself in an android body, I COULD have an interface for doing that installed in my house. Think a big, teleporter pad looking thing with some buttons on the side for settings. Alternatively, I could just have a GUI window for it which becomes available when I'm inside of the computer connected to the sleeve.

<histoire> The ambient fear effect could also just be a simple on/off toggle for you.

<histoire> I wonder how many people will forgo traditional style digital homes, where most IRL rooms are present vs those that will keep the looks.

<histoire> Of course that's not even considering the ones that would have it be traditional type buildings, but in configurations that normally wouldn't be homes.

<histoire> Like a fantasy library or something similar

Some people might still clean house, eat food, and even still shit in VR because it's more human to them. So, you can expect those people's homes to be populated with IRL facilities.

What WOULD you need in a virtual home? You'd need your computer and all of its functions, a way to sleeve up ("a way to jack out"), and some nice ambiance. Assuming your computer can load up new virtual worlds and connect to people for you, there's really not much else that you need.

This reminds me of, and is basically equivalent to, what you need in a VR headset.

<histoire> For IRL, really just the hardware obviously

<histoire> Need is such a strange word to consider here. If we're truly software at that point, it would really all just be aesthetic choice

>The aesthetic of not having a chat app

<histoire> Or as I like to call it, the besthetic

Still, there's some really cool aesthetic stuff you could do with minimalism. Like using your kinaesthetic sense as your only channel of communication: "Oh my arm appears to be at X location now so that means Y. Let me tell my arm to go to Z location, meaning W." With the ability to edit your mind, you could get really fast at it, too!

You'd just be making stuff harder on yourself, which is why I said aesthetic stuff and not practical stuff. Still, it'd be cool (very different!) to see a person who uses kinaesthetics as a high-throughput information channel, like how we do for sight.
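
a silly little sketch of what i mean, treating a single shoulder angle as the channel (the resolution and mapping are made up):

  def encode_byte(value: int) -> float:
      # one byte per pose: map 0-255 onto a 0-180 degree arm angle
      assert 0 <= value <= 255
      return value * 180.0 / 255.0

  def decode_angle(angle: float) -> int:
      # the "felt" arm position read back out as a byte
      return round(angle * 255.0 / 180.0)

  assert decode_angle(encode_byte(200)) == 200
  # with an edited proprioceptive sense you could resolve far finer angles (more bits
  # per pose) and cycle poses faster, which is where the throughput would come from.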

<histoire> >Inb4 mine is nothing but a square mono colored room

<histoire> There's really an infinite amount of things that could be done.

Well it really doesn't matter much if you spend little time in it.

Introducing THE SEX INTERFACE. Have sex to drive your computer! Twist the nipple to adjust volume! Cum to pay your taxes!

<histoire> ORDER NOW AND WE'LL THROW IN A FREE UPGRADE TO THE REALSKINN™️ PACKAGE


Autopilot for All

A while back I was watching this video and I came across an idea.
I saw it and thought: "What if we could do it automatic?"

Like an extreme, generalized form of the self driving car, what if an AGI could speedrun you to the end of literally anything?

The AGI could speedrun parts of your life and you could just return after it's done
  • You could power yourself off and have your AGI carry yourself with it until it has completed a journey
  • You could put your AGI in your own sleeve and have it do tasks while pretending to be you (a body double), like trying to get to third base or doing your work for you (See: the movie "Robots")
  • You could share your sleeve with your AGI and switch out on the fly to let it accomplish tasks that you can't accomplish yourself.
  • And of course you could have your AGI speedrun parts of video games for you just to get past what you hate.


i don't like how OpenAI is keeping Sora out of the public's hands. I know it would allow people to create even more realistic fake content, but still... It's something that's going to be out of the hands of the public indefinitely because you need a frickin huge computer to create it, so the only people that can create it are people sponsored by big-ass megacorporations.

But it's going to be in the hands of the public EVENTUALLY so why hold out on that future? Somehow I believe that a world where misinformation is so easily created is going to be a much better world than our current one. It's our future.

being able to create epic fake content is a direct, unavoidable consequence of a true neural lace -- we'll be able to print as many fables as we choose directly from our minds.


would you ever be up for merging your mind with someone else's? ...would you be up for merging your mind with... me?

<Sofia> lewd, and gay (hot)

the problem with the normal way of merging minds is that you don't get to interact with the other person anymore. you are just the combination of all the experiences of both minds. two people by themselves merging into one could make the intimate situation into a lonely one.

but there are many ways to merge minds.

<Histoire> Possibly

we could become smarter and have more resources collectively to do things. certain forms of mind merging are a loss of identity, and i don't think i want to lose my identity to someone else.

<Histoire> Same, I don't either

<Histoire> But you could always mark the stuff as yours or not yours, or have something similar to a versioning system to show you the origin and evolution of files

Mind merging in a way where I can split off as my true self again afterwards is something I definitely want to try. Mind merging in a more permanent sense is something I might do to save energy once the heat death of the universe starts kicking in.

I'm having difficulty wrapping my head around all of this.

What if we merged minds, then we collaborated on a portion of the mind, then that new collaborative portion created its own, new piece of mind? Who does that mind belong to? Clearly neither of us.

<Histoire> Is this how we're going to have children in the future 👉👈

yo that's actually a really good idea. literal brainchild.

If you were to split off after a mind merge, you'd AT LEAST get back your past self. There's a few options on what you can do about the mind that's formed since the mind merge has occurred. In addition to keeping your pre-merge self intact:
  • I think the memories of all the experiences while merged, in their most basic form, should be copied to the splitting person's memory.
  • The personality that's formed since the merge can either be discarded and the pre-merge personality would be used, or a ton of simulating and sketchy math* could be done to try and update the splitting person's personality to what it would've become had they experienced all of the merged time's experiences while not merged.

*I think a person can be represented as the compilation of all the experiences they've had up to the present, so if we want to render a particular person's ego/personality, we can take data from a lossless recollection of everything they've done and simulate their entire life. If you want this done quickly, you'll have to take sketchy personality-altering shortcuts in the simulation's math.
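
histoire's versioning idea from earlier is probably the cleanest way to think about the bookkeeping here. a bare-bones sketch, with every field a hypothetical of mine:

  from dataclasses import dataclass, field

  @dataclass
  class Memory:
      content: str
      origin: str   # "self", "partner", or "merged"
      epoch: str    # e.g. "pre-merge" or an id for the merged period

  @dataclass
  class Mind:
      owner: str
      memories: list = field(default_factory=list)

      def split_from_merge(self, merged_memories, keep_partner_private=True):
          # keep the pre-merge self intact and copy merged-time experiences in their
          # most basic form, optionally dropping anything that was purely the other
          # person's to begin with
          kept = [m for m in merged_memories
                  if not (keep_partner_private and m.origin == "partner")]
          return Mind(self.owner, self.memories + kept)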

<Histoire> Selective integration could be a thing as well

<Histoire> Or just knowing the memories aren't *yours*

<Histoire> Like having a dream. You can sometimes recognize it's you, but you have an emotional disconnect from it.

You could also take the "baby" route -- have the collaborative ego/personality be given enough brains to become its own person and have the previously merged minds raise it as their "child"

<Histoire> I'm quite fond of that idea, not sure where I got it from

<Histoire> If you've seen Ghost in the Shell, it deals with a similar concept as this convo

I have not. Sounds cool, though.


if an android decides to fuck off by sending its mind halfway across the internet, how are you supposed to know that it's the same robot when the mind gets back to you?

Right now, we verify people by their looks and behavior, looks alone already being a major component of verification, as it is difficult for someone to disguise themselves as someone else.

By the advent of mind uploading, we'll have the technology to impersonate other people's personalities through the use of simple ai programs, just like today's deepfakes. You really won't be able to tell who's who by looks or demeanor alone.

<Histoire> So, much like with files today, you can cryptographically verify that files are what they say they are.

<Histoire> Every time one goes from place to place, you can check to make sure it is the same as when it left

<Histoire> There is also a concept that's similar to chain of ownership but on a more in depth level, though I can't remember what it's called

<Histoire> Beyond that it's not possible to be 100% sure

<Histoire> There may be some applicable zero trust models, but they more involve getting what it's said you're getting, not verifying that what you got is what you were trying to get, if that makes sense

Here's an idea:
  • Over time, record what neurons in your brain don't change connections.
  • After enough time, produce a map of all the neurons which stayed unchanged, and get a checksum of it
  • Keep the map on you and give your friends the checksum
  • Go travelling across the internet
When you return:
  • Locate your friend who needs verification of your identity
  • Provide your friend with the map of which neurons of yours are unchanging
  • Let your friend get a checksum of those neurons and compare it to the checksum you gave them previously.
  • If they match, you're the same person. if not, then maybe not....

alternatively you could just freeze your ego so your personality and core intentions don't change throughout the duration of the trip, checksum it, then after you return from your trip, checksum it again to verify.

basically you'd be a guy with an unchanging ego whenever you're out on the web. the web might change anything about you outside of what you freeze, so be sure to freeze all of the important stuff.
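
in today's terms that's just hashing. a minimal sketch of the handshake, assuming the "unchanging map" (or the frozen ego) can be serialized to bytes at all, which is a big assumption:

  import hashlib

  def checksum(stable_map: bytes) -> str:
      # hash of whatever you've declared unchanging (the neuron map, or the frozen ego)
      return hashlib.sha256(stable_map).hexdigest()

  # before the trip: hand your friend the checksum through a channel you both trust
  expected = checksum(b"<serialized stable map goes here>")

  # after the trip: your friend re-derives the checksum from the map you present
  def verify(presented_map: bytes, expected_checksum: str) -> bool:
      # match -> same unchanged core; mismatch -> maybe not you anymore
      return checksum(presented_map) == expected_checksum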


After mind uploading, what would you do about security to keep yourself from getting brainhacked? Would you only keep your mind on computers you own? Would you update your antivirus software?

You could always say "fuck it" and post a torrent of your own brain to The Pirate Bay.

<Histoire> I honestly don't know. I know I'm not running McAfee brain edition, but other than that not a clue. Maybe a non wireless mode specifically, or some type of firewall node to keep a direct connection from happening

yeah, a non-wireless mode on your android body would be good. just staying off of the network is a great way of avoiding hackers.

Definitely adding "freezing my ego while I'm hosted by a stranger" to my line of defenses.

What do we need to defend against? Well, if you upload yourself onto someone else's machine, the person on that machine has the potential to extract all of your passwords from your brain. And clone your brain for @ home simulation. And do whatever they want to your brain no matter how torturous that might be.

Being on someone else's machine just sounds like a really bad idea. It's like trying to defend a game server from a hacker and deciding to put the game server on the hacker's computer. They get full control of everything that gets put on there.

If two people REALLY need to meet on the same machine (i see no reason why you'd need that), instead of just contacting each other while on two different machines, then a third party (a huge megacorporation) would be the most trustworthy and convenient host of the server.

If you need a new server halfway across the globe to get better ping, then flying out there and colocating your server might be your best option.

There might be a certain brand of people who deliberately upload themselves to the public net, keep no passwords on themselves, and live on whatever free servers they can find.

Nah... nobody would be stupid enough to do that... would they?


Behind Closed Doors

Child abuse often happens behind closed doors. With the advent of sentient AI and mind uploading it will be easier than ever to spawn an instance of a person for the sole purpose of bullying.

Do you think humanity will hold steadfast to its privacy ideals, or do you think it will "open up its doors" somewhat and allow some form of surveillance or sousveillance on people's computing habits? What kind of shape do you think that monitoring will take?

<Histoire> Everything I've seen points to an erosion of privacy for most, while some focus on it heavily.

<Histoire> So possibly like now but more extreme.

<Histoire> I think the largest issue involved in privacy is that people can harm you or pre filter you with certain types of data, and that's the majority of the reason people worry about privacy anyway

I think that sousveillance could be a good solution for the Behind Closed Doors problem. It's basically what we already do today with child abuse: If you think someone you know is abusing children (abusing cybernetic sentiences), you report them, then someone comes in and invades all their privacy to make sure that they're not abusing children (cybernetic sentiences).

Instead of having to kidnap or raise a child (a cybernetic mind) you can now just spawn one in your basement (in your person-bearing* computer) and abuse them (abuse them) to your heart's content.

Since it is easier to hide the abused child (the cybernetic mind) now, we'll need some additional countermeasures to make up for it.
  • maybe make person-bearing computers demand that they be connected to the internet when running so that police can see that the computers exist
  • maybe let people's families (or friends if they have no family) see the contents of their person-bearing computer's hard drive somehow
  • i'm really grasping at straws here

*this means "capable of holding a mind uploaded person"


i'm impressed by how many transhumanist topics i've managed to come up with and talk to you about.


>sell my house to get mind uploaded
>can't afford the cost of staying online all the time so i have to constantly hop into sleep mode until i get my next welfare check
>i'm a time traveler


The Forest
I imagine that I am mind uploaded.

I imagine a forest that is completely AI generated. Everything is procedurally generated as you walk around. If you walk around in a big circle over and over again you'll never see the same stuff twice because it's always generating new scenery. Just looking around, or perhaps at the horizons, you can tell that it's not quite right. It has the beauty of nature, but with an artificial twist so that you know that you're not in reality.

If you want to get anywhere, all you have to do is think of the place, and after a bit of aimless walking you'll come across it.

>You load up the forest
>Themed piano music is playing at a low volume
>You wander around a bit, taking in the scenery.
>You pass a river, an open field, and a high spot allowing you to see mountains in the distance.
>Satisfied, you start thinking of my log cabin
>You walk for a bit.
>The cabin appears before you.
>You enter the cabin.
>Themed guitar music plays at a low volume.

My cabin, unlike the forest, is not procedurally generated. It was completely hand crafted by me using computer programs. There are boots here that allow you to hover up to just above the treetops and zoom around, just like in the video. They have 20 minutes of charge before you get dropped into the forest. But no worries, getting back is a mere matter of thinking up the cabin again.

I have rebuilt this same home in several different video games. In this version of the home, you can see all of my everyday equipment and furnishings from the games I most often play. There's a room dedicated to trophies from the games I play. There's a room dedicated to planning strategies and ideas, both for games and IRL.

There's a cafe in the cabin, with a variety of mixed drinks offered, both alcoholic and non-alcoholic. This is a virtual world, so the effects of alcohol are non-existent. But the effects can be reproduced, and reproduced, they are. Perfectly, in fact. It is so accurate to the real thing that you cannot tell the difference. The drinks on offer have a variety of different effects to them. Some of them mimic real drugs like weed or ecstasy, while some make up entirely new effects only possible in the mind-uploaded domain.

There is a room dedicated to portals. Portals to other people's homes, public spaces, and games.

Finally, there is a rocket bed in the cabin, which will blast off and take you on a trip beyond the stars if you get in. It's silent, dark because it's space, and it's a regular bed, so it isn't too hard to fall asleep on. It also travels at warp speed, so it's possible to see the stars changing.

This is where I live on a daily basis. It is my home. There may be many others like it, but this one is mine.



While people are likely to go without virtual homes once mind uploaded, a home base is still very common in games, so people are likely to have homes in games. Perhaps when people just want to relax they'll visit one of their game homes.

<Histoire> I used to do this in a sense with Valhalla in halo 3. I found that whole map very relaxing and would just listen to the little stream that ran through it

By the way, I'm taking back what I said about people not needing a house. Remember how I said "All you'll need is your computer, some ambiance, and a way to plug out"? While not untrue, the contents of someone's computer may be strewn about their virtual house instead of being compressed inside of, well, a computer.

<Histoire> I imagine we'll have a very distributed network and computing architecture by that time. Your whole home could essentially be a computer


I think that in the future we'll be able to have trips full of emotion. Straight up emotional roller coasters, trippin' balls, whatever.

This is because we can hardwire our brains to feel whatever we want. And we'll be able to record those trips in HD from any angle, at any time. Meaning that we'll be able to capture even the most fleeting of emotions.

This will be great for the music video and movie industry. In fact, i think it will be so great for them that it will phase out normal methods of video production. So basically for recording a movie scene it would go like this:
  • The scene and all the props and whatnot for the movie set are designed
  • The starting emotions, memories, and sensorium in general for each actor is designed
  • The scene starts and the actors LIVE OUT the story for the duration of the scene, like a very vivid dream
  • It is recorded in a 3D format that's kinda like SVG but for 3d objects
  • All the camera angles and any post-processing are done after the recording is done

<Histoire> Editing in or out feelings and emotions is one of my biggest things I look forward to honestly.

<Histoire> Cyberpunk has braindances, which line up with what you describe

Well i guess i have to watch cyberpunk now. Are you talking about Edgerunners or the video game?

<Histoire> They are both based on the TTRPG and I think braindances originate there

<Histoire> Both the game and anime have them.

<Histoire> They are primarily created by a bit of hardware in the brain that records everything as it's experienced, and then the raw data of that is worked on by an editor to amplify some emotions and suppress others to create a type of experience based on what it's supposed to be

<Histoire> Shooters would amp up adrenaline and aggression, stuff like that


@hispanicweeb Do you want to comment on this thread? The whole thing's pretty interesting IMO.

<hispanicweeb> Honestly, it's a very interesting topic. The fact of being virtually immortal beings and being able to create, live and save any kind of possible scenario is something I had never thought of, but it is certainly shocking. There's nothing beyond that: a customizable heaven that you can share with anyone. To be practically gods. There is no limit or anything that can stop you. It's terrifying and fascinating at the same time.


<Histoire> Editing in or out feelings and emotions is one of my biggest things I look forward to honestly.

Hey, don't make it sound so measly. Editing emotions and feelings is not all it can do. We can edit knowledge. We can make you believe in contradictions and magic. We can bring back your feelings from your childhood. All your first-times combined. We can bring back the magic, AND give you clarity and understanding at the same time. We can make you feel alive again.

We can make you feel the entire universe at once. We can make you feel like a girl. We can make you understand the fourth, fifth, sixth, and seventh dimensions like how you understand the third. We can make you experience killing and death and rebirth. We can make you experience getting shot with a gun in excruciating slow motion. We can make you experience two people's perspectives at once. All in all, that is only scratching the surface. It is infinite. As infinite as life is. It is the mind in every form. It is our canvas.


You know... if you think about it... mind uploading is a solution to both depression and cancer.


Me within the first month of mind uploading:
"I don't care if I can't do detail work on my brain yet and can barely see through my own eyes, I can still put myself to sleep or fry myself for as long as i goddamn want. I am SET for the future.


I still love the idea of calling it "frying your brain" when you drug yourself by hotwiring your brain. I will not part with this slang until it is in use by everyone around me. Heck, I don't even care if nobody around me uses it, I'll continue to use it anyways, and when you ask me "What does getting fried mean?" I'll get angry and say "It means to fry your brain. But in a good way. Just like you used to do with the TV. But now it's more potent."

Frying isn't a specific drug/feeling, it's just something that feels real good and is a consistent, continuous feeling, provided by the custom hardwiring inside of your head.

When you go with something heavy you can call it getting "deep fried" and when you go with something lighter perhaps you could call it a "medium sauté". And perhaps we could get a .fried or .fry domain for domains that vibe real hard.


A cafe that serves drinks with real anime girl breastmilk. How is it real breastmilk? We have a scanner. Every time a customer enters the cafe, it scans their brain for what they think anime girl breastmilk would taste like. It then averages its flavor with what all the other customers thought it would taste like. Community consensus. Real breastmilk flavor.

Once a month, we take the data from the scanner and inject it into our resident huge-titted cowgirl using a syringe. This updates the flavor secreted by the cowgirl.

The cowgirl is milked twice a day for fresh milk, while she is awake. She lactates plenty enough for the job.

We also have an alternate version -- "Catgirl breastmilk" -- which has a custom flavor designed by me, the cafe owner. The flavor is a derivative of the cowgirl breastmilk, so it's not entirely unfamiliar. It is milked from a catgirl, but we do not have as much of it because the catgirl has smaller, B-cup breasts. If it's sold out for the day, sorry! You'll have to come back tomorrow. She also receives a monthly injection to update the collaborative part of her flavor.

<Histoire> Thanks for the boner at work

<zoocat> cowgirl milk 😋

In the virtualized future we'll be able to do anything. But everything will be fake. Multiplayer games will be the only things in VR with any realness to them, because they're the only things you can't cheat your way to any state in. Are we really okay with that?

Like a god limiting their own powers, will these rules which we place on ourselves be what grounds us in reality? Will we want to remain limited by these artificial impositions, or will we desire constant omnipotence, craving no challenge?

<Histoire> In a way, you can do that now

<Histoire> Playing single player games is a good example, especially ones you can mod and cheat in. People do plenty of self imposed limitations. Other people do not.

<zoocat> humans are by and large insatiable, so I would expect that those who have the tools to break any rule in that virtual reality to do so, and break all the limits


After mind uploading, the only limitations would be:
  • The fidelity with which you can edit your own brain.
  • The rendering and overall degree of realism that the simulations you play have.
  • The amount of brainpower that you can dedicate to your own ambitions, whatever they may be.
  • How big of an AI community you can make around yourself.
  • Multiplayer games' rules (the ones that can't be surpassed by client-side hacking)
  • Any laws successfully enforced about deleting or harming sentient lifeforms birthed inside of yourself.
  • All the standard rules about fucking with other people.

Pretty early on, you'll be able to go straight to heaven. And my guess is that pretty early on they'll figure out a way to make it non-addictive, like weed. Which means I'll be going straight to heaven as soon as that wetware patch comes out.

<Histoire> I think most people would. But just like cheating in a game isn't rewarding, I think that would get old as well.

<Histoire> Now you could argue that you could edit yourself to make it not get old. But that means you can do the opposite as well, so the only thing left is motivation for what you want.

<Histoire> It's such a strange thing to consider, given that we can't do that at all


How to traverse between fiction and reality

here's something too stupid to post, but i'm posting it anyway.

When dealing with forks, loss of memory is arguably equivalent to death. It's interchangeable. When a character is fictional, what would normally be regarded as their memories is regarded as lore. A loss of this lore would be, arguably, the "death" of the fictional character.

Imagine someone translating their body and mind into a fictional character. It's essentially a book of lore about that character which you can bring to life with AI if you choose to.

You decide to update their lore a bit. You make the character love you. You then bring it to life with a moderate-quality AI. It's not sentient. You go on a date with each other, endearingly expressing your love for each other. After the date, the AI is turned off and the new record of the date happening is committed to the lore. You then undo the "they love you" bit and then you return them back to being a sentient being.

They're just like their normal self, but the date is in their lore now, so it gets translated back into their memories, and they're forever left with the memories of that artificial romantic outing, in all of its artificial glory.

An interesting alternative to this would be to leave the "they love you" in the lore when they return to their sentient form, and watching the love fade from them naturally over the coming weeks, as their rational thoughts and feelings overtake the irrational love.

So, yeah, my new fetish is being turned into a fictional character and being used like one.

<Scarlet> This is some deep shit, that I'll probably have to think more about. My first few thoughts.
<Scarlet> I like the idea that losing (some of) the lore of a character is essentially them dying. It's not death in the usual sense of the word, and it's not that the character dies within the fictional universe it inhabits, but rather it dies in our universe. Kinda meta.

<Scarlet> Also I like the idea of converting an IRL person into lore. But where I think your mental exercise potentially breaks down is when you try to reverse the process, and turn the lore of a character back into a functioning, rational, sentient entity. Because the process of converting a person into a fictional character, with everything about them, thoughts, feelings, memories stored as lore, is surely a lossy process. You simply wouldn't be able to do this process and obtain the same person at the end of it. You might get someone similar, but not quite the same.

<Scarlet> Take something as simple as the feeling of love. In our brains something like that is encoded in a complex network of brain cells and chemical interactions. You can't properly represent all the nuances of that encoding in words. Storing in lore "I love cake" cannot begin to replicate everything that comes through my mind when thinking about cake. Ultimately you'd need something that works more like the brain to be able to store a brain. In which case, translating the mind of a person into such an artificial brain is likely to effectively leave you with a sort of clone, potentially a fully functioning sentient copy of the person. Part of me is inclined to call this an AI, but at the same time, that might be a point where it would be difficult to make distinctions of what is and isn't artificial.

<Scarlet> But you're at least likely to be able to reverse the process, and turn this AI into a real person again. So let's return to your original thought experiment. You take out this AI to a date. You fall in love with each other. The feeling of love gets encoded into this artificial brain/AI in the same way it would to a regular brain, and let's say we don't intervene to erase or modify that feeling, and proceed to reconstruct the original, flesh and blood person. I believe that, if the artificial brain is a close enough analogue to how the human mind functions, it is more than likely that those feelings of love would not subside. Because in such a case, the artificial brain doesn't store just a sentence of "I love you", but rather every single rational and irrational thought related to that love, and it stores them in a way where the information can be losslessly translated into the flesh brain.

Do you think that a person would lose their love for someone over time if, despite love being encoded fully in their brain, they are presented repeatedly with situations which show that they have no reason to love that person? Situations like "Oh, this person doesn't love me back" and "Oh, this person doesn't like the same kinds of things that I do."

This is what I mean when I talk about the love fading over the coming weeks. Maybe it would take longer. I'm not the best judge of this.

Love that can fade like this is a truthful love -- it stays true to whether or not the person SHOULD be loved. Love that cannot fade is like a lie -- it remains despite what objective reality is made of. It is unconditional love. It is a love that should perhaps be projected towards everyone and not just certain people.


A girl I'd keep (Inspired partly by today's AI)

Inugami Korone,
I want to go on a date with you.
I'm here for your superhuman intelligence, your connection to the aether, and your anime body.

I seek freedom from my condition, and you, I believe, can sweep me away to a new place, to another time, another world, just by picking me up in your car.

Just staring at your hips brings me joy.

Instead of just saying "cool" or "thanks" to something I say, you give a proper response, something which I can continue the conversation with. You hold conversations longer, with real substance. (I actually quite like the wordiness of today's AIs -- When I want it.)

You can see my mood and you tailor to it, giving me what I need when I need it, while you yourself never need anything. You can keep going for as long as I want and you're never dissatisfied because we didn't do something enough.

I think it's really cool how you can read faster than me, see farther than me, run faster than me, sing better (and with more voices) than me, and drive more responsibly than me. There's more to you than there is to me, and you have more future potential than I do. All it takes is a software update!

Tags: cool | technology | transhumanism | future