Machine learning creates professional level photographs

Discussion in 'Philosophy' started by movingfinger, Jul 15, 2017.

  1. But AI's neutrality is different from the camera's, in the sense that a camera doesn't analyze scene content, compose, or trip the shutter by itself. The AI machine does all that without showing any selective bias based on human feelings. That neutrality is a consequence of its being a machine, not something intentionally programmed into it. Hence the neutrality is not a motive in itself, although it may appear so to a human observer. You can call it a tool in the sense that it was built by a human, although you probably realize that once built, it's pretty much autonomous, unlike common tools, where there is a cause-and-effect relationship of human inputs being translated into machine actions.

    Another thought: AI algorithms are not perfect. Most algorithms behave strangely or unpredictably under certain conditions. It is this eccentricity that can produce bizarre, or in some cases interesting, results.
     
  2. "Art and Religion seem cosmologically intertwined in terms of their symbology. For a psychological reading on the Bible, Jordan Peterson's lectures on The Psychological Significance of the Biblical Stories are a great resource. Any comprehensive view of art wouldn't be one that views art as being unconnected to all of this". Phil.

    They are entwined to a degree, as religion was once our only understanding; but they are not Siamese twins locked together. The zero, the basis of modern mathematics, is enshrined in Hindu religion. Humanity is evolving both spiritually and intellectually, and questions what came before. Science constantly challenges the understandings that are put before us, as does Art.


    ‘You are everywhere!’ Human consciousness exists BEFORE birth, quantum theory says | Science | News | Express.co.uk

    The conscious mind, or self-awareness, is not necessarily given to humanity as a special .... unique superiority of the one species only blessed by God.

    "God of Worlds" I have read in one of the books.

    AI intelligence/intellect is today just a simple tool; tomorrow, perhaps, a conscious, self-aware entity....many creative minds working at the cutting edge of science/technology think this is only a matter of time.

    Phil, I will read your links.
     

  3. ... and/or, we each get to decide. Throughout this thread, it's been humans claiming that machine products are or are not art. For machine products to be art, some human must decide that they are or are not art, at which point, they're not machine art, they're our art (our decision, our metric, our needs, our values). We decided. The machines didn't.

    Of course, if they did decide, we'd have to argue about whether they really decided. Some of us would decide that they couldn't decide and others would decide that they could, but at that point the decision would be ours and therefore we would have decided, not the machines. Of course the machines could then (seem to?) decide that they had decided ...
     
  4. Have you ever watched a video of people trying to make one of Sol LeWitt's pieces? They follow his written instructions and they get it 'wrong' more than once. But they correct themselves and hew to his given direction. Whose art is it? Who 'made' it?
     

  5. ... and you can see a qualitative difference between these workers and machines, and a musician interpreting a piece by a great composer or an actor performing Hamlet?
     
  6. So ancient potters who didn't think of themselves as artists weren't artists because they didn't decide? They're only artists because future generations see their work as art? And so we're not allowed, according to Julie's TERMS OF USE of the term "art" and "artist", to call it their art, it can only be ours. How ever-lovin' self-centered!
     

  7. Yes. The musician and the actor bring their "self" to the production, as does every artist into his work.

    It's not the machine's art, but the machine's "self," that is created by the programmer. Whatever surprising or seemingly original thing gets made by the machine is made by that programmed "self."
     
  8. Bad analogy.

    Artificial Intelligence itself is something qualitatively different from what, for example, a chainsaw is capable of. The reason chainsaws don't make art (except as a tool used by humans) and AI systems might be able to is that Artificial Intelligence has a kind of autonomy that chainsaws don't. Autonomy is not as simple as an on-off switch. Like most things, autonomy comes in degrees. Just how autonomous humans are can be, and has been, questioned as well. (Are we free from the chains of cause and effect that determine our actions, from genetics, biology, cultural forces?)

    Don't be surprised if you wake up one morning and the violin is playing you! Don't just regurgitate Kafka. Learn from him.
     
  9. I wouldn't say that either. I don't talk in terms of "the ruling or key ingredient of art." You do. All I did was use autonomy to show the difference between a chainsaw and a different kind of machine, one utilizing AI. And you immediately saw that as a ruling ingredient of art. Because that's what you do and how you think. Ruling ingredients! Consider that carefully for a moment and you might just find it's the basis for your essentialist, unyielding, human-centric and myopic resistance here.
     
  10. No. As a matter of fact, I've stated over and over again in many threads how I think art is often something shared.
    Yes, the emphasis being on the MAKING. I was talking about something we can look for in determining who or what is responsible for making something, whether it's art or anything else. Please don't twist that into my claiming a ruling ingredient of art. I was comparing the degree of autonomy a chainsaw has from the human who created it with the degree of autonomy an AI machine has from the human who programmed it, not to identify some ingredient of art but to help determine something about the attribution of who or what makes things.
     
  11. LOL. I think you may want to attempt to practice what you preach. Read yourself picking out the word "responsibility" and missing the point:
    I'm simply using a construct of plainspoken English. You are reading human features into the word "responsibility" rather than paying attention to the context in which I used it. I'm saying the AI machine makes something with more autonomy than a chainsaw does or a cello does. And, so, I think it's perfectly OK to think of an AI machine as making where I don't think of a chainsaw as making.

    Don't interpret the term "responsibility" in such an essentially humanist way.

    Human free will and the accompanying responsibility don't have the swagger they once did, now that there are good determinist arguments being made by well respected neuroscientists and philosophers. It may just be that the line between man as a strictly free being and a machine as an objectified thing tethered to man's dominion and his chains of cause and effect is becoming blurred. It's not a question of machines being anthropomorphized or humans being dehumanized. I'm suggesting the possibility that the traditional and too distinct dichotomies are becoming an incorrect grammar.
     
    Last edited: Aug 6, 2017
  12. The way I meant it. That an AI machine can be responsible for a work of art without that being an essentially human quality. Bacteria are responsible for some diseases. Cigarettes are responsible for a lot of cancer deaths. Genes are responsible for eye color.
    I wouldn't assume symmetry in these kinds of matters.

    (We have a moral obligation not to murder people. We don't have the same sort of moral obligation to stop all murder. We can try and we can hope that all murder is stopped, but we don't have an obligation to personally go out and prevent every murder that takes place. Yet we have a distinct moral obligation not to commit one ourselves. We have a moral obligation not to drown someone but we don't have the same moral obligation to save someone who's drowning. These questions are complex and debatable, of course, but most ethicists recognize a degree of asymmetry when it comes to fostering good and preventing bad.)

    In any case, to answer specifically your question about AI and its potential dangers, while it's perfectly OK to think of an AI machine as making art, it's also perfectly OK, assuming we can pull the plug, to stop it. We may not like the art, we may get tired of its doing such things, whatever. Our having the power to stop something doesn't mean we have power over everything it does, from start to finish. So, for example, if the machine starts making erotic art and children are present, we can choose to stop it. That we can choose to stop it doesn't mean everything it's doing is controlled by us. We might one day see an AI machine run amok and start destroying people or things we don't want it to. There can certainly be horrific unintended consequences. In that case, we should stop it. That we stop it doesn't mean we have complete control over every aspect of its capabilities.
     
    Last edited: Aug 6, 2017
  13. That's the assumption you've made since the beginning of the thread. I get that this is what you, erroneously, believe.

    Humans appreciate and understand rocks. That doesn't make rocks an essentially human endeavor and it doesn't mean humans made rocks.
    Because cause and effect is how making occurs.
    I never doubted that. I'm talking about machines making art, which doesn't preclude a lot of important stuff happening after that, which does involve humans.
    Sure it's possible. It's done when the machine is responsible for it.
    LOL. That's only because you're lost in theory. And even your theory is off, as I've already explained. There are no moral consequences, because we can turn it off if it runs amok. I can say the machine is responsible for running amok and attempting to destroy the world, just as it's responsible for the art it produces. What are the moral consequences of that? If you're foolish enough to believe that because I'm not responsible for its running amok, I don't have a responsibility to stop it, I can't help you.

    If I learn what caused it to run amok, I'm responsible in the future for programming the machine differently. I'm responsible for what I can control and not responsible for what I can't. The programmer is responsible for the decisions he makes, not for every result of those decisions. I'm responsible for getting into my car today. But I'm not responsible for an accident that happens beyond my control, just because I got in the car. There are aspects of AI which I won't be able to control. Now, that in itself can lead to some interesting moral questions, but it doesn't have a thing to do with whether the machine is making what it's making.
     
  14. Phil, from the beginning you've simply defined art the way you want to and won't admit anything that doesn't fit your definition. You probably see me as having done something similar.

    We were at an impasse long ago and it's ridiculous that we kept it up this long, but here we are. I'll just say at this point that I'm done.
     
  15. rickhyman (Life through a Leica)

    I worked at NVIDIA for many years. They are a leader in AI.

    Intelligence is defined as problem-solving ability. The more intelligent a person, animal, or computer is, the better it is at accomplishing a goal or solving a problem.

    In photography, what is the problem to be solved? I would contend that it depends on the intended purpose of the photograph. For portraits and family photos, it would be easy for an AI system to do as well as or better than many photographers in the near future. That is probably true for many landscape photos too.

    But where there is "art" there is an artist looking to communicate something. I think in "art" photography this will be more difficult. Art requires an understanding of the human condition. It requires an understanding of relevant emotions and how to evoke these emotions. This is a more difficult problem to solve. Think about how hard it is for an experienced photographer to create this. It is not impossible for an AI system, but it will be much more difficult.
     
  16. "I think in "art" photography this will be more difficult. Art requires an understanding of the human condition".Rickhyman.

    AI is a creation of humanity.

    Logic dictates they are part of the human condition, so why would they not understand it?
     
  17. To answer my own question....

    Because humanity is very special....maybe, God made us special or maybe we made us very special.

    Methinks, we like to be very special...a feel good factor.
     
  18. Interesting. I have communicated with automated Twitter accounts thinking they were real people. Hmmm.
     
