Would an artificial human have rights?

28 posts

Flag Post

Let’s say it’s 20XX and man has found a way to artificially make humans down to the last detail, meaning they can feel pain, love, hate, learn, etc.
Do you think they would have rights? Why or why not?

 
Flag Post

The real question is why would we do that? After all, the process of making a new human is fairly simple (a bit time- and resource-consuming before adulthood, but still), and it’s hardly like we are moving towards a future with too few humans in it.

 
Flag Post

thijser, it’s not a problem of how many (quantity) humans are in our future, it is a matter of the quality of them….reduced disease susceptibility, elimination of genetic maladies, etc. We are largely talking about genetic engineering, something along the lines of old-fashioned selective breeding to produce a more desirable crop/herd.

There are thousands of genetic goof-ups that could be corrected….maybe very easily in the future—20XX.

But 15man is likely talking about androids: something that doesn’t come from an existing life form, which is something altogether different from the above. He is proposing we invent a very complex machine….one that isn’t ALIVE.

I ask this: would this machine still be OWNED by someone?
If so, we likely could give it rights that only befit its property value.

However, this movie, from this book, addresses this “problem” very well.

 
Flag Post
Originally posted by 15man:

Let’s say it’s 20XX and man has found a way to artificially make humans down to the last detail, meaning they can feel pain, love, hate, learn, etc.
Do you think they would have rights? Why or why not?

We actually already can. We just duplicate genes, but this is HIGHLY illegal! But let me answer your question with another question: when the whites found the blacks, did they have equal rights??? No. So I think that they wouldn’t for a long time, but maybe some day.

Originally posted by karmakoolkid:

thijser, it’s not a problem of how many (quantity) humans are in our future, it is a matter of the quality of them….reduced disease susceptibility, elimination of genetic maladies, etc. We are largely talking about genetic engineering, something along the lines of old-fashioned selective breeding to produce a more desirable crop/herd.

There are thousands of genetic goof-ups that could be corrected….maybe very easily in the future—20XX.

But 15man is likely talking about androids: something that doesn’t come from an existing life form, which is something altogether different from the above. He is proposing we invent a very complex machine….one that isn’t ALIVE.

I ask this: would this machine still be OWNED by someone?
If so, we likely could give it rights that only befit its property value.

However, this movie, from this book, addresses this “problem” very well.

Yes, but those genetic “goof-ups” (mutations) are how we adapt. You never know if one day a disaster will strike and those with the mutation will survive while those without it won’t. So making these artificial people could in reality make us much more susceptible to diseases and such.

 
Flag Post

…when the whites found the blacks, did they have equal rights??? No.

Some of them had equal enough rights to become Roman Emperors.

 
Flag Post

No, because CROWS will not be giving their human slaves any rights :>>>

 
Flag Post
Originally posted by beauval:

…when the whites found the blacks, did they have equal rights??? No.

Some of them had equal enough rights to become Roman Emperors.

Facedesk. I’m talking about when white Americans found blacks and made them work on plantations. lols :)

Originally posted by RollerCROWster:

No, because CROWS will not be giving their human slaves any rights :>>>

No no, my friend, you have it backwards. The humans don’t give their slave crows any rights :P

 
Flag Post

beauval, I told ya that you should have put out the extra bucks and gotten that Mind-reading Chip at your last refitting.

 
Flag Post
Originally posted by karmakoolkid:

beauval, I told ya that you should have put out the extra bucks and gotten that Mind-reading Chip at your last refitting.

He can have it, but it’s not going to do him a lot of good. Implanted in beauval’s head, it’ll read his own thoughts as opposed to anybody else’s. (oh and yes, such devices do exist)


As to the OP, I’m assuming they’re referring to organic humans, as opposed to embodied artificial intelligences. The OP’s too nebulous to really tell, but ‘exact in every detail’ would tend to suggest genetics.

It’s quite possible we may at some point decide to create a being with an engineered genome. Think Gattaca here. Or the Outer Limits episode with the same theme but a different take on it whose name eludes me. Using gene engineering to create smarter, stronger humans. A species to replace homo sapiens sapiens as the dominant lifeform on the planet.

Certainly doable from a pure science perspective, but from a rights perspective, they would probably start out with equal rights as designer babies, then have that status questioned as their physical and/or mental superiority becomes more and more obvious. Ultimately there would come an ‘interesting time’ when the issue arises of whether such beings should still be classified as human, the equal of humans, as they slowly edge untampered, ‘pure’ humans out of every field and thus come to pose a threat to humanity, as it were.

I can quite easily see a near-future world with three ‘separate’ species sharing a common genetic profile, sharing this world.

  • ‘Natural’, untampered-with humans evolving blindly the old-fashioned way
  • Highly genetically engineered specimens with enhanced bodily control, muscle density, reworked spinal structure and genius-level intellect as standard
  • Technologically augmented individuals with inorganic parts replacing part or all of the original body, members drawn from both the above camps.

As both of the latter groups are intrinsically going to have capabilities well above those of the first group, it’ll come down to number distributions as to whether the latter two groups keep equal rights, or have those rights more and more brazenly contested by old-fashioned humans who fear they are losing their place at the top of the food chain and becoming obsolescent. A fear-based reaction to beings with far more capability than they have, competing for the same jobs and living in the same environs as they do.

 
Flag Post

If you’ve got any memory chips, I’ll take a few. And if someone builds an android, he’ll be needing some.

 
Flag Post

Are we talking about people who are the product of genetic design, like something out of Blade Runner, or highly sophisticated androids?

Either way, I would probably support their rights to humane accommodation as I would a normal human’s. If they were done precisely enough, would we even know the difference?

 
Flag Post

Probably yes, unless you’re a conservative guy who thinks true AI can’t be created because he was told so by a certain institution.

Revolution, anyway. I would support AI-kind.


While I think we would fight for AIs’ rights, I don’t have 100% certainty… we could always ignore designing true AIs, or treat them like slaves forever (in some kind of galactic dystopia).

 
Flag Post
Originally posted by beauval:

If you’ve got any memory chips, I’ll take a few. And if someone builds an android, he’ll be needing some.

If only. They haven’t advanced to the level required to interface long-term with the human hippocampus yet, anyway. Soon as they do, I’ll let you know (Well, soon after, since the shorter the waiting line to get one, the better I like it).

An android or gynoid brain has the advantage there; there’s no existing wetware that their artificial memory has to interface with. That means scar tissue buildup and implant rejection aren’t problems they have to deal with. Their memory will be designed to function optimally with their brains.


Originally posted by yeasy:

Probably yes, unless you’re a conservative guy who thinks true AI can’t be created because he was told so by a certain institution.

What are you referencing, please?

 
Flag Post
Originally posted by beauval:

If you’ve got any memory chips, I’ll take a few. And if someone builds an android, he’ll be needing some.

Get in line, bub.
I’ve been in it for…for…hmmmmm… damn, I forget how long.
 
Flag Post
Originally posted by 15man:

Let’s say it’s 20XX and man has found a way to artificially make humans down to the last detail, meaning they can feel pain, love, hate, learn, etc.
Do you think they would have rights? Why or why not?

This will not happen because everyone would be afraid they would have rights.

 
Flag Post
Originally posted by 15man:

Let’s say it’s 20XX and man has found a way to artificially make humans down to the last detail, meaning they can feel pain, love, hate, learn, etc.
Do you think they would have rights? Why or why not?

We have rights?
But on a serious note, perhaps. Probably not as fully exercised, though.

 
Flag Post

See Bicentennial Man starring the late Robin Williams. It’s a movie about one robot’s quest to become completely human.

 
Flag Post
Originally posted by Aleazor:

See Bicentennial Man starring the late Robin Williams. It’s a movie about one robot’s quest to become completely human.

Saw it (several times), loved it, highly recommend it…because its time definitely will come.
 
Flag Post

I think that if this ever happens, then no, we should just exterminate all artificial humans. This may sound harsh, but first off, they are just a pointless contribution to overpopulation. Then again, if we ever get advanced enough to make an artificial human that is exactly the same as a real one, we will probably have overcome overpopulation by colonizing other planets. If that is the case, we should still exterminate all artificial humans if we ever allow them into society, because making them is completely pointless; in fact, it is really cruel.

Just think about it: if you knew that every emotion you ever felt existed because of what happened in a lab, then you would feel worthless and depressed, and then you would think that THOSE emotions are simulated, and then that these thoughts are simulated, etc., etc. On the flip side, imagine you were in a relationship with someone who is perfect for you and then you find out that they are lab-made. You would be very upset to find out that ‘the one’ isn’t even a biological being. The only possible advantage to an artificial human is for lonely people, but if they are kept only for lonely people and they have human emotions, then they will feel like tools, and if you take out the emotions they would feel when used like that, then the lonely people would be more aware that they aren’t talking to a human.

So to sum it all up, we should never make these things and if we do we should immediately get rid of them all.

 
This post has been removed by an administrator or moderator
 
Flag Post

Isn’t giving rights to an entity the way to make a human?

 
Flag Post
Originally posted by powerpos:

Isn’t giving rights to an entity the way to make a human?

We give rights to animals, but that doesn’t make them human. So no.

 
Flag Post

Well, even if we give animals ‘rights’, they don’t necessarily get the same rights as humans.

Do we make lab rats sign consent forms before experimenting on them? Do dogs accused of biting people have the right to a lawyer? Do we give chimpanzees the right to vote? Do horses have the right to run for the Senate?
(Ok, so there was that one time Caligula made his horse a senator… but everybody knows he was one of the most insane Roman emperors to ever live.)

Most ‘animal rights’ are really laws designed to prevent ‘excessive cruelty’. You can still kill your livestock for food…as long as you don’t do it in a cruel or unusual manner (such as moving live cows to the slaughterhouse with a forklift or snowplow, or deliberately throwing live chickens against a wall), and as long as you avoid being any crueler than necessary in animal experimentation (properly anesthetize your lab critters before surgery, keep them well fed, keep them in sanitary living conditions, etc.). Even when animals have ‘rights’ they are still legally considered the property of humans. Animal rights laws merely put restrictions on how said ‘living property’ is used and set minimum guidelines for the care and maintenance of said property.

Things might get legally weird if we enhance the cognitive abilities of lab chimps to those of a human (perhaps a ‘Dawn of the Planet of the Apes’ scenario or something involving cybernetic implants).

To further complicate things, perhaps consider the unfortunate humans with mental disabilities that ultimately render them ‘less intelligent than a chimpanzee’, or worse, ‘less intelligent than livestock’, or perhaps even people who are brain dead. They still receive all the same legal rights and protections as any other human being (albeit usually with a relative wielding power of attorney, etc.).
What will become of them in a world where we start extending rights to uplifted lab animals on the basis of having a ‘human level of intelligence’? (Seriously, past Supreme Court rulings have a huge impact on future cases. Such a hypothetical ruling might set a precedent that could have some unforeseen legal consequences.)

Note: just because we’ve gotten it to work on monkeys or chimps doesn’t always mean we can get it to immediately work on humans (at least not without significant amounts of tweaking). In “Dawn of the Planet of the Apes” the apes were granted intelligence with the use of a virus intended for ‘gene therapy’ (normally viruses insert their DNA into a host cell to trick the cell into producing more virus bodies instead of the usual proteins and such; the idea behind gene therapy is to genetically engineer the virus so it will replace a defective bit of host DNA with a working version of the gene). In the movie the virus affects humans and chimps quite differently (for the same reasons that chimps manage to harbor HIV without dying of AIDS), and ultimately the virus ends up being fatal to a large portion of the human race.

In regards to the example of using viral gene therapy to ‘uplift’ some lab animals…we’re still a long way off, technologically speaking, from getting it to work.
I remember that my biology textbook mentioned viral gene therapy (basically, viruses reproduce by inserting their own DNA into a host cell in order to trick it into manufacturing virus particles instead of the normal proteins, etc.; the idea behind viral gene therapy is to genetically alter the virus so it will insert copies of healthy human genes). Unfortunately, it ended up giving the patients cancer because the virus inserted the DNA into the wrong part of the chromosome (a gene needs to be in the correct ‘locus’, or location, on a chromosome in order to work properly; if a gene is inserted into the wrong locus or it displaces something important, it can cause problems. In this case the gene was accidentally inserted into a locus that is important in regulating cell division, hence the cancer). At the moment we simply don’t have the tech necessary to ‘command’ a virus to insert a gene into the correct locus, so yes, it’s extremely risky. (If I remember correctly, the experiment in viral gene therapy was to correct a blood disorder, so they basically took out the patients’ bone marrow, exposed it to the virus, and inserted it back in. For a while it worked and resulted in healthy blood cells, but eventually all the patients ended up with various kinds of cancer.)

 
Flag Post

Rights are currently designated by humans, being the apex predators of this planet. They are differentiated by species, so humans receive human rights and non-humans receive animal rights. It’s undeniably black and white: you either have human rights, animal rights, or no rights.

So as it stands, the simple answer to your question depends on how the term “artificial” comes into play: a human who is cloned – whether grown in vitro, in a test tube, or built “as is” – would retain the full human genome and would be a human. Artificial robots, which are predominantly being developed by the US Army, are not considered human… which is a major cause for concern given that the Army is now pressing for legislation to allow its ‘killing robots’ to make kill decisions in the absence of a human. Other forms of ‘artificial human’, so long as they do not rely on the biological genome representing the human species to survive, are not considered human and therefore have no human rights.

Of course, we make the rules, so we may choose to give such an artificial form its own rights, or even extend human rights to it, but as of right now, animal rights would be the best it could hope for, if that.

 
Flag Post
Originally posted by MarkNutt2012:

which is a major cause for concern given that the Army is now pressing for legislation to allow its ‘killing robots’ to make kill decisions in the absence of a human.

It won’t get through. There are too many problems with current machine vision algorithms recognising targets correctly in busy visual environments for any such deployment not to have massive negative political fallout the first time it’s used. That’s a short-term self-resolving problem only.

I’m personally more interested in the scenario where we do succeed in creating AGIs. Artificial General Intelligence, or Strong AI, is very, VERY different from the weak AI we’ve been pursuing for the past fifty years. Rather than an algorithm or a set of algorithms intended to intelligently solve a specific task (which is the definition of weak AI), an AGI is specifically intended to be a human-level, independent, self-aware intelligence. We’re nowhere near successfully creating them yet, but the split between strong AI and weak AI work occurred a few years back, and we’re seeing a real push for the creation of strong AI.

When we succeed (and there’s no doubt we will; there are no theoretical barriers impeding the task), that’s when things get interesting. We’ll have completely artificial minds capable of rational thought and self-determination. That’s when the rights of artificial beings will really matter.