    MineCraft Real or Fake?

    Discussion in 'Rant Archive' started by Burgerking_360, Apr 28, 2018.

    1. Burgerking_360
      Offline

      Burgerking_360 Active Member

      Joined:
      Mar 3, 2016
      Messages:
      26
      Likes Received:
      0
      Not enough 1080p graphics for the money of the graphics; the graphics would be more antialiased if they went to the 4kgraphicsp. I cannot believe how many graphics are not in this game; it's so funny to me that I laugh at the graphics in this game. Why not just use the graphics from Knack, when you can have good graphics like 400k on PS4, but there's not enough RAM for the Mega Drive for it to be good enough graphics for online mode? Thank you for not putting good graphics in this game, ok? This game should have better graphics; that was true even when the PS3 was out, so you know I'm telling the truth on this one, you guys, it's real.

      Microsoft says 1080k is not runnable on Sonyware, but they lie to me all the time, and you know that the playsystem could run even more than 400p at 60 frames per second at launch. So they lied again, for the fifth time today. When you really think about how many graphics there are in this game, it really makes you think that it's not enough good graphics at all. I lol at the graphics of the Nintendo Switch game too. Nintendo should go back to making graphics for the old Nintendo, like the Nintendo Wii, because Reggie said he would start making more Wii games now that the graphics have caught up to his original version. By the way, he also lied when he said that he had completed the Knack trilogy for the Nintendo Switch, because it isn't even going to come out with graphics right enough for the Nintendo Switch, so that's how you can tell that he lied. At the end of the day it's really all in the graphics; like Reggie says, "what happens in the Game Boy stays in the GameCube."

      Too bad Microsoft turned the graphics industry on its head when they released the new Xbow Scropion King. It doesn't even need a LaserDisc to run its graphics because of how many megabytes it can run through its microchip. PC gaming is not good graphics because those are just computer games, so you cannot play with a controller. By the way, if you think you are a real gamer, then you should get a PlayStation 4, because its graphics run at a much higher framerate than the Xbow 360, because they have fourteen-core processor technology, and that's why Microsoft will always be a liar in my opinion. And what's the deal with airline food? It's like, ok, just give me the PlayStation 5 so that I can finally have more pixels on Knack. Hello, Microsoft? Hello? Ever heard of the bloom graphics option and motion blur? You can't even run 60 frames per second on any of the new Game Boys, so don't even get me started again. If you say that Nintendo can run those kinds of graphics, I will just lol at you for saying that to me.

      At the end of the day graphics are king, and at the end of the day Microsoft can never be graphics enough to put their games on the PS4 because of its intense graphics chip. I don't think Microsoft even has a graphics chip, because Bill Gates says he doesn't believe in the michaelchip, and that's why PC gamers are free. It's me, Burger; if you are actually reading this right now, my soul has been trapped in the gombo dimension, please rescue me. Anyways, when you really think about it, graphics on a michaeldrive will never outperform a sonysoft, because the frame rate is 100p faster on a Blu-ray than in a DVD player like the one Microsoft put in the Xbome 300. Face it, Nintendo, you are just making baby graphics at this point that I can run on my Xbow One, and that wasn't even the first Xbox, so grow up, Nintendo, or you will lose all your remaining graphics and have to put Master Chef in Smash Brothers 2.

      Finally, my most favorite game for the PlayStation is the Quall of Duty titles featuring James Quall as Mandrake, the king of Anubis. Too bad the new Mirobox can't even run virtual 4k, because it can't even have the graphics to supply the power strip that powers the metaldrive on this bad boy. In conclusion, Microsoft will always be in last place when it comes to graphics, but really Nintendo is less graphics than Microsoft, so thank you for 4kgraphicsp.
       
    2. amli
      Offline

      amli Boss Member

      Joined:
      Apr 13, 2016
      Messages:
      4,566
      Likes Received:
      1,691
      it's definitely fake.
       
    3. Acceptation
      Offline

      Acceptation ❤️ Discord Moderator Premium

      Joined:
      Jun 29, 2017
      Messages:
      8,019
      Likes Received:
      1,071
      Yeah........
       
    4. sircorgi
      Offline

      sircorgi Boss Member

      Joined:
      Nov 28, 2014
      Messages:
      5,642
      Likes Received:
      2,398
      i literally had to google what you were talking about but i still couldn't understand it

      is this something to do with minecraft being available in 4k or something?
       
    5. Zulfqar
      Offline

      Zulfqar Well-Known Member

      Joined:
      Nov 26, 2014
      Messages:
      1,495
      Likes Received:
      444
      While GPUs were originally designed as specialized processors optimized to render millions of pixels required for simulating 3D environments, repurposing GPUs to train artificial intelligence algorithms has been commonplace for a while.
      But it wasn’t until I read Andrej Karpathy’s recent post on reinforcement learning that something clicked about how interesting this is:
      Graphics cards, originally designed for human vision of video games, are now being used for computer “vision” of video games.
      When I was growing up, getting a graphics card was kind of a Big Deal. The first one I got for Christmas in 1997 was a Pure3D Canopus Voodoo card based on the 3dfx chipset. It let me run Quake smoothly on my Pentium Compaq, which was a top priority of my life at the time.
      Ever since then, I’ve always thought GPUs were an interesting innovation driven by gamers. And it turns out I wasn’t alone: everyone from Bitcoin miners to artificial intelligence programmers to medical researchers has repurposed them for novel applications.
      In terms of neural networks, Karpathy points out something important: they aren’t really seeing or behaving like a human (hence the scare quotes) since they don’t actually understand the game. They’re merely brute-forcing a gameplay strategy by tracking pixels representing the ball and a player’s score going up.
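      To make “tracking pixels” concrete, here is a rough NumPy sketch of the kind of pipeline Karpathy describes for Pong: crop and downsample each frame to a small binary grid, then feed it to a tiny policy network that outputs the probability of moving the paddle up. The crop bounds, downsampling factor, and layer sizes below are illustrative assumptions, not values taken from this article.

          import numpy as np

          def preprocess(frame):
              """Reduce a 210x160x3 Atari frame to a flat 80x80 binary vector."""
              frame = frame[35:195]              # crop away the scoreboard and borders
              frame = frame[::2, ::2, 0].copy()  # downsample by 2, keep one color channel
              frame[frame == 144] = 0            # erase one background shade
              frame[frame == 109] = 0            # erase another background shade
              frame[frame != 0] = 1              # whatever is left (ball, paddles) becomes 1
              return frame.astype(np.float32).ravel()

          # A tiny two-layer policy: 6,400 pixels in, probability of "move up" out.
          rng = np.random.default_rng(0)
          W1 = rng.standard_normal((200, 80 * 80)) * 0.01   # hidden-layer weights
          W2 = rng.standard_normal(200) * 0.01               # output weights

          def policy(pixel_vector):
              hidden = np.maximum(0.0, W1 @ pixel_vector)    # ReLU
              return 1.0 / (1.0 + np.exp(-(W2 @ hidden)))    # sigmoid -> P(move up)

          fake_frame = rng.integers(0, 256, size=(210, 160, 3), dtype=np.uint8)
          print(policy(preprocess(fake_frame)))              # roughly 0.5 before any training

      Nothing in there “knows” what a ball or a paddle is; reinforcement learning simply nudges W1 and W2 toward whichever pixel patterns tended to precede the score going up.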
      And video games are really just meant as toy examples to prove machines can learn to perform in complex environments. If a machine can learn how to play a video game, the thinking goes, then it can learn how to navigate game-like scenarios in the real world such as picking up objects or behaving in a goal oriented way to complete a task.
      Putting aside the question of whether reinforcement learning could ever achieve human or super-human intelligence in the real world, there’s something poetic about hardware originally designed to optimize a human’s experience being used to optimize a computer’s experience. We have repurposed hardware originally designed for outputting video games into hardware that can supply the input for a video game.
      An ungenerous way to characterize this observation is that GPUs are just powerful hardware thrown at a large set of tedious math problems. And indeed, traditional computer processors (also known as CPUs) are perfectly capable of doing the math required to train neural networks. They’re just a lot slower.
      To understand why GPUs are so much faster than CPUs for this type of work, it’s worth getting an intuition about how GPUs are different from CPUs. In his Book of Shaders, Patricio Gonzalez Vivo has a great explanation of the differences between normal processors (central processing units, or CPUs) and GPUs.
      He suggests envisioning a CPU as a big industrial pipe which is capable of only doing one task at a time:
      [Image: sketch by Patricio Gonzalez Vivo]
      … versus how a GPU performs tasks in parallel:
      [Image: sketch by Patricio Gonzalez Vivo]
      Specialized software like Google’s TensorFlow or Theano enables programmers to leverage this massive parallelism for things like training neural networks with batches of data.
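      As a small illustration of that pattern, the sketch below (assuming TensorFlow 2.x; the layer sizes and random training data are made up) builds a toy classifier and feeds it batches, which TensorFlow turns into large parallel matrix operations on a GPU whenever one is visible:

          import tensorflow as tf

          # Report which accelerators TensorFlow can see; an empty list means CPU only.
          print("GPUs visible to TensorFlow:", tf.config.list_physical_devices("GPU"))

          # Fake dataset: 10,000 examples with 784 features, 10 classes.
          x = tf.random.normal((10_000, 784))
          y = tf.random.uniform((10_000,), maxval=10, dtype=tf.int32)

          # A small fully connected network; each batch becomes one big matrix multiply.
          model = tf.keras.Sequential([
              tf.keras.layers.Dense(256, activation="relu"),
              tf.keras.layers.Dense(10),
          ])
          model.compile(
              optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
          )

          # Batches of 128 keep the device busy with parallel work.
          model.fit(x, y, batch_size=128, epochs=2)

      The same script runs unchanged on a CPU, only more slowly, which is the point above about CPUs being capable of the math but nowhere near as fast at it.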
      Play drives innovation
      That computer scientists have opportunistically repurposed gaming hardware for artificial intelligence applications shouldn’t be surprising, as there’s a much richer history of games cultivating innovative technology. Games have influenced everything from button design to social networks in areas far afield from where they began in the game industry. Remember when the Department of Defense built a supercomputer using 2,200 PlayStation 3s?
      In the case of GPUs, industry leaders have already seized upon this opportunity and invested heavily. NVIDIA, a company originally focused on manufacturing consumer-level 3D graphics cards, developed CUDA, a platform for exploiting “off-label” usage of GPUs. They’re now selling a line of high performance data-focused GPU-based cards that aren’t even capable of outputting video:
      [Image caption: “Look ma, no ports!”]
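      To give a feel for what targeting CUDA looks like, here is a hedged sketch written in Python using Numba’s CUDA support rather than raw CUDA C (Numba is a substitution of mine, not something mentioned above, and running it requires an NVIDIA GPU with the CUDA toolkit installed). Each GPU thread computes one element of the result, which is the “many simple tasks in parallel” model from the pipe sketches:

          import numpy as np
          from numba import cuda

          @cuda.jit
          def add_kernel(a, b, out):
              i = cuda.grid(1)          # this thread's global index across the whole grid
              if i < out.size:          # guard threads that land past the end of the arrays
                  out[i] = a[i] + b[i]

          n = 1_000_000
          a = np.arange(n, dtype=np.float32)
          b = 2 * a
          out = np.zeros_like(a)

          threads_per_block = 256
          blocks = (n + threads_per_block - 1) // threads_per_block
          add_kernel[blocks, threads_per_block](a, b, out)   # Numba copies the arrays to and from the GPU

          print(out[:5])   # [ 0.  3.  6.  9. 12.]

      Notice that nothing here is graphical at all: the card is being used purely as a very wide arithmetic engine, which is exactly the “off-label” usage CUDA was built to make convenient.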
      Facebook has taken NVIDIA’s cards even further and open-sourced a server design called Big Sur, a rack unit that can host eight GPU cards and is designed specifically for training neural networks:
      You can even rent GPU hardware for yourself by the hour on Amazon AWS: their EC2 GPU instances are designed to run NVIDIA’s CUDA code off the shelf and can be launched as part of a cluster of machines designed to train neural networks.
      So while technology like the GPU was originally seeded by demand from gamers, it now has massive applicability to other fields.
      But why did GPUs evolve in such a useful way? It isn’t just because game developers wanted beefier hardware. It’s because game developers created applications that needed a different kind of hardware. GPUs were designed to solve millions of complex problems in parallel, something traditional computers weren’t great at.
      It’s not a coincidence, then, that GPUs designed for a computationally intensive task like 3D simulation could also be applied to another computationally intensive problem, like training a neural network.
      The complexity and success of GPU hardware is linked to the complexity and success of its software — games.
      Powerful GPUs didn’t arrive overnight or for free: there was a huge community of hungry gamers itching to buy the fastest hardware so their games would run even faster. So gamers’ demand for better hardware effectively financed the future of distributed processing power.
      But why did that happen? Why have games become such powerful drivers of innovation and culture?
      In Homo Ludens, cultural theorist Johan Huizinga argues that play is fundamental to our humanity and that games are necessary to our culture and society.
      GPUs represent our collective investment in attending to that necessity: we financed and developed hardware to satisfy our need to play.
      And since that need is so fundamental to our humanity, that hardware turns out to be useful for solving other higher order “human” problems like image recognition, speech detection, and playing the games themselves.
       
    6. sircorgi
      Offline

      sircorgi Boss Member

      Joined:
      Nov 28, 2014
      Messages:
      5,642
      Likes Received:
      2,398
      okay, we didn't need an entire website copied and pasted here that wasn't even edited to get rid of the captions and things that don't even make sense
       
    7. Acceptation
      Offline

      Acceptation ❤️ Discord Moderator Premium

      Joined:
      Jun 29, 2017
      Messages:
      8,019
      Likes Received:
      1,071
      Am I literally the only one who has no idea what is going on here?

       
    8. Grooties
      Offline

      Grooties Active Member

      Joined:
      Oct 4, 2017
      Messages:
      213
      Likes Received:
      5
      After reading the entire essay about graphics I forgot what we were even talking about.
       
    9. BertBerry
      Offline

      BertBerry Experienced Member

      Joined:
      Mar 29, 2016
      Messages:
      175
      Likes Received:
      58
      I can't be bothered to read these essays
       
    10. Zulfqar
      Offline

      Zulfqar Well-Known Member

      Joined:
      Nov 26, 2014
      Messages:
      1,495
      Likes Received:
      444
      Today, I intended to write an essay on laziness, but I was too indolent to do so.

      The sort of thing I had in mind to write would have been exceedingly persuasive. I intended to discourse a little in favor of a greater appreciation of indolence as a benign factor in human affairs.

      It is my observation that every time we get into trouble, it is due to not having been lazy enough. Unhappily, we were born with a certain fund of energy. We have been hustling about for a number of years now, and it does not seem to get us anything but tribulation. Henceforward, we are going to make a determined effort to be more languid and demure. It is the bustling man or woman who gets put on committees, who is asked to solve the problems of other people, and neglect his or her own.

      The man or woman who is thoroughly and philosophically slothful is the only thoroughly happy person. It is the happy person who benefits the world. The conclusion is inescapable.

      I remember a saying about the meek inheriting the earth. The truly meek person is a lazy person. He or she is too modest to believe that any ferment and hubbub of his or hers can ameliorate the earth or assuage the perplexities of humanity.

      O. Henry said once that one should be careful to distinguish laziness from dignified repose. Alas, that was a mere quibble. Laziness is always dignified, and it is always reposeful. Philosophical laziness, I mean. The kind of laziness that is based upon a carefully-reasoned analysis of experience: acquired laziness. We have no respect for those who were born lazy; it is like being born a millionaire: they cannot appreciate their bliss. It is the person who has hammered his or her laziness out of the stubborn material of life for whom we chant praise and alleluia.

      The laziest man we know (we do not like to mention his name, as the brutal world does not yet recognize sloth at its community value) is one of the greatest poets in this country, one of the keenest satirists, and one of the most rectilinear thinkers. He began life in the customary hustling way. He was always too busy to enjoy himself. He became surrounded by eager people who came to him to solve their problems. “It’s a queer thing,” he said sadly, “no one ever comes to me asking for help in solving my problems.” Finally the light broke upon him. He stopped answering letters, buying lunches for casual friends and visitors from out of town, lending money to old college pals, and frittering his time away on all the useless minor matters that pester the good-natured. He sat down in a secluded café with his cheek against a seidel of dark beer and began to caress the universe with his intellect.

      The most damning argument against the Germans is that they were not lazy enough. In the middle of Europe, a thoroughly-disillusioned, indolent and delightful old continent, the Germans were a dangerous mass of energy and bumptious push. If the Germans had been as lazy, as indifferent, and as righteously laissez-fairish as their neighbors, the world would have been spared a great deal.

      People respect laziness. If you once get a reputation for complete, immovable, and reckless indolence, the world will leave you to your own thoughts, which are generally rather interesting.

      Doctor Johnson, who was one of the world’s great philosophers, was lazy. Only yesterday, our friend the Caliph showed us an extraordinarily interesting thing. It was a little leather-bound notebook in which Boswell jotted down memoranda of his talks with the old doctor. These notes he afterward worked up into the immortal Biography. And lo and behold, what was the very first entry in this treasured little relic?

      Doctor Johnson told me in going to Ilam from Ashbourne, 22 September, 1777, that the way the plan of his Dictionary came to be addressed to Lord Chesterfield was this: he had neglected to write it by the time appointed. Dodsley suggested a desire to have it addressed to Lord C. Mr. J. laid hold of this as an excuse for delay, that it might be better done perhaps, and let Dodsley have his desire. Mr. Johnson said to his friend, Doctor Bathurst: “Now if any good comes of my addressing to Lord Chesterfield it will be ascribed to deep policy and address, when, in fact, it was only a casual excuse for laziness.” Thus, we see that it was sheer laziness that led to the greatest triumph of Doctor Johnson’s life, the noble and memorable letter to Chesterfield in 1755.

      Mind your business is a good counsel, but mind your idleness also. It is a tragic thing to make a business of your mind. Save your mind to amuse yourself with.

      The lazy person does not stand in the way of progress. When he or she sees progress roaring down upon him or her, he or she steps nimbly out of the way. The lazy person does not (in the vulgar phrase) pass the buck. He or she lets the buck pass him or her. We have always secretly envied our lazy friends. Now we are going to join them. We have burned our boats or our bridges, or whatever it is that one burns on the eve of a momentous decision.
       
