Console Wars IV
-
Asher is a professional in all things possible. That's what you'd think when reading his posts, at least.
-
Originally posted by Nikolai
Asher is a professional in all things possible. That's what you'd think when reading his posts, at least.
It is true that I usually only post on things I know about. I would suggest you do the same, but then you'd never post.
-
Originally posted by Wiglaf
And professionals are immune to malware? In that case I'm a pro too.
-
Originally posted by DrSpike
No, professionals are both knowledgeable and balanced.
-
Originally posted by Jon Miller
Originally posted by Kuciwalker
Isn't that called Harvest Moon? It's actually been pretty successful.
I like Harvest Moon.
I'm disappointed there isn't a computer version.
-
Originally posted by Agathon
The fact that it requires newish hardware to run graphics that don't look much better than OS X circa 2001.
It runs like a pig unless you have a pretty decent machine.
The UAC thing that Wiglaf is complaining about has bothered a lot of people. It's no good saying "It's OK for me" because it is pissing off a significant portion of users.
The fact that it is basically XP with nicer graphics for most people. [...] As I said, the general impression from reviewers has been middling to bad. That's not good enough.
And then there are some pros who don't like Vista because they don't think, for whatever reason, it's a good tool for doing their job. (I know some.) Maybe they have a point, but who cares? Certainly not the average user like you and me.
Even though Vista isn't groundbreaking, even though some of its features piss off some pros, the fact remains that it's a very good OS for the average user (and it's definitely better than XP).
Microsoft still can't design a user interface (mind you, they can't seem to design anything).
Maybe I'll buy a Mac one of these days, just for the hell of it. I'll experience first hand what you've been fussing about all those years.
-
The main problem with Vista is the image it has, largely because of Apple's marketing campaign. I'd read an article the other day that explains Vista pretty well:
From my perspective, Vista faces two major issues. Clearly, there is the image problem. For the last 18 months, Vista has been getting poor press, and the loudest marketing has been the negative stuff coming from Apple.
Brooks acknowledged Apple's impact and said the "sleeping giant" had woken up and hinted at the company's forthcoming $300 million multiyear marketing push.
But the second issue, which is beyond the image problem, is what I'd call the operating system's dessert-to-vegetable ratio. Many of Vista's changes are under the hood. They were necessary things like improved security, a new graphics engine and driver model. Those are like veggies. You have to eat them, but you are going to have a tough time getting people to flock to the table.
Although Vista has some tasty treats, like better photo handling and built-in desktop search, its new features haven't exactly taken the world by storm. I doubt I'm saying anything the Windows team hasn't already realized--but the next time they come out with a new OS, they would be well-served to have three or four drool-inducing features that motivate people to get a new PC or upgrade their old one.
The company has taken a step in the right direction in announcing that Windows 7 won't make any major architectural changes (less veggies), but they need to make sure that their entree is appetizing and that the dessert is top-notch.
This is what I've been saying here since before it even launched. Vista's a far bigger upgrade than most people, least of all detractors like Agathon, understand. Microsoft's in a precarious position where they need to preserve backwards compatibility while introducing new features, all without pissing off existing users with huge changes. For instance, I love the new Office 2007 GUI, but a lot of businesses and a lot of users push back on it heavily. It is without a doubt a superior interface, but it's different -- and different is bad for a massive portion of Windows users.
So what MS did with Vista was to sort of keep the genuine look and feel of Windows as people know it, then lay the groundwork (or framework, in this case) for the next generation of software, which works and looks very different from today's interfaces and systems.
There is a staggering number of changes in Vista's innards. Starting from the ground up -- the kernel in Vista is easily the most advanced and scalable of any kernel ever made. A single system (not a distributed system!) can have tens of thousands of processors and the kernel will distribute its workload evenly across them. Nothing else comes close to doing that -- not Darwin in OS X, not any BSD distro, not Linux, not any UNIX. It's got extremely advanced memory management and allocation features as well that'll come in handy as we start introducing parallel processors on add-in boards with their own memory pools (like Nvidia's CUDA has started doing). The entire driver architecture has changed from the ground up. The desktop presentation engine is all new and fully running off the graphics card, something OS X doesn't even do yet.
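Just to make the scalability point concrete: here's a rough Win32 sketch (purely illustrative, not anything Microsoft ships) showing the processor and NUMA topology an ordinary program can query -- the kind of information the scheduler works with when it spreads threads across the machine. The API calls are standard Win32; the program itself is just a toy.
[code]
// Illustrative toy (not from this thread): query the processor/NUMA topology
// that the NT scheduler balances work across, then pin the current thread
// to one CPU as an example of overriding that balancing yourself.
#include <windows.h>
#include <cstdio>

int main() {
    SYSTEM_INFO si;
    GetSystemInfo(&si);  // basic processor count for this system
    std::printf("Logical processors: %lu\n", si.dwNumberOfProcessors);

    ULONG highestNode = 0;
    if (GetNumaHighestNodeNumber(&highestNode)) {  // highest NUMA node index
        std::printf("NUMA nodes: %lu\n", highestNode + 1);
    }

    // Pin the calling thread to processor 0 -- normally you just let the
    // kernel distribute threads itself.
    SetThreadAffinityMask(GetCurrentThread(), 1);
    return 0;
}
[/code]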
Then there are the new APIs -- WPF, WWF, WCF, etc. are all new stacks of technology that really fundamentally change how Windows applications are designed and programmed. Win32 still exists and is perfectly usable, which is what 99.9% of software still uses, but the core of Vista's new functionality is in the "WinFX" APIs that are just now starting to get used.
The change to Vista is not unlike the change from OS 9 to OS X. MS just had the resources to bridge the gap so it wasn't jarring, something Apple could not do. The downside is that uninformed people, who unfortunately make up the majority of the population, just look at the superficial and say "it looks like XP with a new paint job".
Ultimately, it doesn't matter what those people think. People are going to use Vista, whether they like it or not. If not in Vista, they'll use it in Windows 7. Vista just laid the groundwork for the next generation of operating systems, and it's starting to get a little frustrating to see idiots online, without a clue how OSes work or what's genuinely new in Vista, ***** about Vista being a "trainwreck". Frankly, anyone who says such a thing is either a troll or a minion of Jobs or Torvalds' armies. They're not realists, in any case.
-
Let's continue this discussion here: http://www.apolyton.net/forums/showt...hreadid=178847
-
Originally posted by Nostromo
It is a problem, I'll grant you that. (Not on my PC, mind you, since I have a decent CPU) That said, a lot of early Mac OS X reviews also complained that it ran like a pig.
Apple has been optimizing the code for about 7 years now and I'm pretty sure you still need a decent machine to run it. BTW, I doubt that Leopard has substantially lower system requirements than Vista. I don't see Leopard running on an Eee anytime soon. Vista is supposed to get faster with SP1. (For what it's worth, I have Vista SP1 and it's quite fast, not sluggish at all.) I suppose it'll get better with SP2 and SP3.
I ran every version of OS X that was supported on my old mac, and while 10.3 was getting towards the limit it was still usable.
At first glance, it sure looks like XP with a fresh coat of paint. But if you use it for a while, you'll discover a lot of nice touches that improve usability. 90% of the little things that irritated me about XP just disappeared. And if you ask me, most of the bad reviews come from people who were expecting something groundbreaking. Of course, if you look at it from the average user's perspective, Vista isn't groundbreaking at all.
I don't see any major problem with the XP UI. And Vista improves on it quite a bit. I like the breadcrumb navigation, for example. I don't think they invented it, but who cares, it's there and it works. And I don't know if you've tried Office 2007, but the UI is very nice.
Maybe I'll buy a Mac one of these days, just for the hell of it. I'll experience first hand what you've been fussing about all those years.
-
Full of ****
Originally posted by Agathon
That's not true.
Ars Technica. Power users and the tools they love, without computing religion. Oh yeah, did we mention we are unassailable computing enthusiasts.
Enough abuse. Let's look at how OS X performs under "normal use." It turns out to be a mixed bag. We've already seen the application launch picture: classic, good, OS X native, not-so-good. Readers who have followed our past coverage of OS X are undoubtedly curious about the performance pitfalls that have dogged OS X throughout its development process. I'm sad to say that, for the most part, they remain.
Window Resizing
The headliner is opaque window resizing. On my G3/400, it was unacceptably slow across the board. All opaque resize interactions required an attentiveness and time investment that was way beyond sane limits. The interaction was much like the aviation phenomenon of "chasing the needle": over-correcting in response to time-delayed instrument feedback. In OS X, it works like this: you grab the corner of a window and drag to make it smaller or larger. Visually, nothing happens, so you drag some more. Eventually, the display updates and you see that you've over-shot your goal size. You reverse your action to correct, and the cycle repeats, oscillating until you converge on your desired window size.
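For what it's worth, the "chasing the needle" behaviour described above is exactly what you get when someone corrects against feedback that lags a few frames behind. A tiny simulation (invented numbers, purely illustrative -- not from the article) shows the overshoot-and-oscillate pattern:
[code]
// Toy simulation (illustrative only): drag a window toward a target size
// while reacting to a display that lags a couple of frames behind. The size
// overshoots the target and oscillates before settling -- "chasing the needle".
#include <cstdio>
#include <deque>

int main() {
    const double target = 400.0;  // size the user actually wants
    const double gain   = 0.5;    // how hard the user corrects each frame
    const int    lag    = 2;      // frames before a resize shows up on screen

    double actual = 700.0;                    // true window size
    std::deque<double> pending(lag, actual);  // sizes still waiting to be drawn

    for (int frame = 0; frame < 12; ++frame) {
        double displayed = pending.front();    // what the user currently sees
        pending.pop_front();
        actual += gain * (target - displayed); // correct against stale feedback
        pending.push_back(actual);
        std::printf("frame %2d: actual %6.1f  displayed %6.1f\n",
                    frame, actual, displayed);
    }
    return 0;
}
[/code]
With zero lag the same loop walks straight to the target; the oscillation comes entirely from the delayed redraw.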
Compared to the "gray outline" abstraction used in classic Mac OS during window resizing (shown below), OS X's opaque window resizing is meant to give better visual feedback and increase the feeling of "direct manipulation." But the reality of the situation is that poor performance makes opaque resizing in Mac OS X less responsive with less of a feeling of direct manipulation than classic Mac OS's gray outlines.
But anyone who finds the performance "acceptable" has very low standards, in my opinion. Even on the dual G4/450, it was unbearably slow. And there's no denying the fact that it is much slower and less responsive than classic Mac OS's gray outlines. Again, keep in mind the goal of opaque resizing: better visual feedback and an increased feeling of direct manipulation. Mac OS X's opaque resizing fails on both counts, and would have been disabled if usability was the primary concern.
On the other hand, if you add "marketing" and the ability to perform gee-whiz demos to the picture, perhaps the feature can be better explained. While I recognize the value of such things, you have to draw a line in the sand somewhere. At the very least, Apple should have added a system-wide option to disable opaque resizing in favor of gray outlines.
The cause of this performance gaffe is probably only important to the Apple folks who have to fix it, but that hasn't stopped rampant speculation among Mac users. The current front-runner is the idea that many Quartz operations cannot be accelerated by current video card hardware, which is accustomed to simply streaming solid pixels to the screen. That simple streaming of solid pixels, at least, is accelerated in OS X. But almost every other operation requires calculations by the CPU (and memory bus overhead to feed it) before the pixels to be drawn can be passed to the graphics card.
This model, at least, correctly predicts the observed behavior that windows with many child elements resize more slowly than windows with few. The poster-boy for slow resizing is the Finder list-view window, which is rife with child elements: column headers, rows, and cells. According to the theory, each of those child elements must be calculated by your poor, tired CPU before the pixels to be drawn can be sent to the graphics card. The more calculations, the slower the redraw.
Second generation display layers like Mac OS 9's that are only concerned with drawing solid pixels to the screen (with the occasional small-scale compositing calculation) are fully accelerated by today's 2D graphics hardware. Fully accelerating a third generation display layer like Quartz will require new, more powerful hardware. Already, there is talk of using the considerably more sophisticated 3D hardware on today's graphics cards to give Quartz a boost. No, this does not mean that windows will suddenly become three-dimensional. It just means that all those compositing calculations that your CPU has to do today when running Mac OS X may be off-loaded to a super-powerful GPU on your 3D graphics card. My fingers are crossed.
Quartz
Aside from window resizing, the Quartz display layer performs admirably. That's not to say it performs as well as Mac OS 9's considerably simpler display layer, because it doesn't. Mac OS 9's display performance is undeniably faster on the same hardware. This is not surprising at all, given how much less work it does in comparison to Quartz, but it would be nice if Apple could make the performance gap a bit smaller.
Multitasking is where application performance really comes into play: how well does each application behave in this new environment? Classic applications continue to exist in their cooperative little world, of course. The only new wrinkle is that if the classic environment itself (a regular userland process, remember) is not getting enough CPU cycles, the applications running in classic will also be starved, and therefore unresponsive. As mentioned earlier, I saw an extreme case of this when installing the Developer Tools CD. Classic, more than any other OS X application, has hooks deep into the core OS, all the way down to the Mach kernel. As an apparent consequence, it seems to be the most susceptible to bad behavior in response to system load.
The next offender has no such defense. The OS X Finder is the undisputed king of unresponsiveness among the OS X native applications. We've already touched on the horrendous list-view resize problems, but wait, there's more. An unreasonable number of operations "block" in the OS X Finder. By "block", I mean that the Finder is unresponsive during these operations. Almost anything that requires network access exhibits this behavior. If the mounting of an AppleShare volume or your iDisk takes 30 seconds, the Finder will be useless for 30 seconds. This is clearly an application shortcoming, not a fault of the OS, since many other applications do not exhibit these problems. It's a shame that such an important application is so afflicted.
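The "blocking" pattern described there is just slow I/O being done on the thread that services the UI. A minimal sketch (illustrative only, obviously not Finder code) of the difference between doing the slow work inline and handing it to a background thread:
[code]
// Illustrative sketch (not Finder code): a slow "mount" run on the UI thread
// freezes everything for its full duration; the same work on a background
// thread leaves the UI free to keep responding. Build with a C++11 compiler
// (link with -pthread on some platforms).
#include <chrono>
#include <cstdio>
#include <thread>

void mount_network_volume() {
    // Stand-in for a 30-second AppleShare/iDisk mount.
    std::this_thread::sleep_for(std::chrono::seconds(3));
}

int main() {
    // Blocking style: nothing else happens until the mount returns.
    mount_network_volume();
    std::puts("UI was frozen for the whole mount");

    // Non-blocking style: mount in the background, keep servicing the UI
    // (simulated here by the periodic prints).
    std::thread worker(mount_network_volume);
    for (int i = 0; i < 3; ++i) {
        std::puts("UI still responsive...");
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    worker.join();
    std::puts("mount finished without blocking the UI");
    return 0;
}
[/code]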
The bundled 5.1 "preview" version of Internet Explorer is another performance dud. It takes forever to launch, has the usual assortment of window resizing and scrolling problems, often blocks when downloading files, and generally behaves worse than IE 5.0 running in the classic environment. After fighting with IE 5.1 for several hours, I gave up and pasted the pretty IE 5.1 icon onto my copy of IE 5.0 and used that for my web browsing instead. Its performance is superior in all respects.
QuickTime performance remains problematic on the G3. I was unable to play even a single copy of the large (588x440) version of the Ruby iMac television advertisement at the full framerate on the G3/400, even on a totally idle system. This movie plays just fine in Mac OS 9 on the same machine. The dual G4/450 does much better, happily playing two copies of this movie simultaneously at full speed on a lightly loaded system. Obviously, the G4's AltiVec unit helps QuickTime playback a great deal, but I'm still puzzled by the poor G3 performance, especially the disparity between playback in OS X versus Mac OS 9. This has been a problem throughout Mac OS X's development, and it's a shame to see that it's made it into the 10.0 release. Is Apple simply giving up on the G3 in favor of G4 optimizations?
One final popular performance metric: "will my MP3s skip?" Well, it depends. A Mac OS X native version of Apple's iTunes MP3 player/encoder application was released on March 24th, but is not included in the OS X box. It's a rough port that suffers from some major redraw slowdowns, and it doesn't yet support many features like CD burning and full-screen visualizations. Nevertheless, left minimized in a corner, it dutifully plays nearly skip-free MP3s.
I say "nearly" because I can indeed make it skip on my G3/400 by, say, grabbing a translucent terminal window and shaking it back and forth as fast as I can. (The tremendous CPU overhead of this action actually makes sense, given the graphics acceleration issues discussed earlier.) This trick doesn't work on the dual G4/450, but I was still able to make iTunes skip on that machine by exercising the classic environment (starting, stopping, using classic applications, etc.)
According to Apple's documentation, Mac OS X's Mach kernel boasts "real-time support" that "guarantees low-latency access to processor resources for time-sensitive media applications." I don't doubt the potential of such Mach features, but I have not seen an application that takes full advantage of these abilities. MP3 playback in the Mac OS X version of iTunes does not seem substantially more "skip-free" than its Mac OS 9 counterpart. CPU cycles are a finite resource, of course, but it seems to me that the full realization of totally skip-free playback could be achieved via Mach's real-time capabilities. iTunes 2.0 on Mac OS X 10.5 perhaps?
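As far as I know, the "real-time support" Apple's documentation refers to is exposed through Mach's time-constraint thread policy. A rough sketch of how a playback thread might ask for it (the period/computation/constraint values are invented for illustration, not taken from iTunes):
[code]
// Illustrative sketch: request Mach time-constraint ("real-time") scheduling
// for the current thread, the usual mechanism for low-latency media work on
// OS X. The timing parameters below are invented for illustration.
#include <mach/mach.h>
#include <mach/mach_time.h>
#include <mach/thread_policy.h>
#include <cstdio>

static uint32_t ns_to_abs(uint64_t ns) {
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);
    // Convert nanoseconds to Mach absolute-time units.
    return static_cast<uint32_t>(ns * tb.denom / tb.numer);
}

int main() {
    thread_time_constraint_policy_data_t policy;
    policy.period      = ns_to_abs(2900000);  // roughly one small audio buffer
    policy.computation = ns_to_abs(500000);   // CPU time needed per period
    policy.constraint  = ns_to_abs(1000000);  // deadline within the period
    policy.preemptible = TRUE;

    kern_return_t kr = thread_policy_set(
        mach_thread_self(), THREAD_TIME_CONSTRAINT_POLICY,
        (thread_policy_t)&policy, THREAD_TIME_CONSTRAINT_POLICY_COUNT);

    std::printf("thread_policy_set: %s\n",
                kr == KERN_SUCCESS ? "granted" : "refused");
    return 0;
}
[/code]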
Performance Summary
Mac OS X is slower than Mac OS 9 on the same hardware. The interface is less responsive overall. All classic applications take a minor speed hit. RAM usage is considerable due to the "double-OS" nature of the classic environment. Despite a superior VM system, OS X can and does get into trouble when paging activity starts to build on systems with close to the minimum required RAM.
Mac OS X 10.0 suffers from a combination of these ailments. It's got the interface responsiveness and RAM hunger of the color/System 7 transition, and the legacy application speed hit of the PowerPC transition, but without the easy native application performance increase.
Is this a fatal combination, or will Mac OS X have a snappy UI and be running classic applications faster than any Mac OS 9 system ever ran them, come 2003? Time will tell, but I think the performance issues alone may be reason enough not to use Mac OS X 10.0 on any system that does not ship with it pre-installed, unless you're an acknowledged early adopter.
Mac OS X 10.0 was far worse performance-wise at its release than Vista was at its release. I'm so tired of you lying through your teeth.
-
Originally posted by Mr Snuggles
I'm not sure if you know this, but computer science grads are in demand right now. Salaries are rising to the point that most grads don't want to settle for game developer salaries (myself included).
Some folks do. Obviously they get compensating advantages. There are always going to be folks at the margin, and if the salaries are high enough it will draw CS grads to the game industry. Can the FPSes cover those salaries? For sure some will. If salaries rise, though, some, at the margin, will be unprofitable and not be made. And salaries will be driven up by competing uses, of course. Will the overall salary for CS grads (or even "good" comp sci grads) be significantly moved by the existence of Wii casual games? More so than by projects in other industries?
Think numbers. What's the total number of CS grads working in the game industry (let's include only the US, Canada, Western Europe and Japan, so we don't have to discuss "Eastern European" code monkeys) on all platforms? For argument's sake we will assume that NO one will shift to gaming from other industries.
Now, of that total in gaming, how many work for Nintendo? Of those, let's deduct the number working on "core games"; what is that as a proportion of total CS grads in gaming?
Now you may want to add some folks working on casual games at 3rd parties - but I wonder if all of those 3rd parties have the same hiring standards as Nintendo.
You're kidding yourself if you think that by changing their focus to making lots of "Wii Fit" and "soup maker extra fun edition" type games, Nintendo isn't diverting resources from other areas as well.
We already see it at a very fundamental level. Do you think that if Nintendo wasn't pandering to the 80-year-old-woman crowd, they would have castrated their machine's specs like they did? Nintendo's decision to go after the casual crowd hurt the more hardcore gamers before the console was even launched.
It hurt the hardcore gamers ON the Wii, because Nintendo was making only one non-handheld console. The lumpiness of HW SKUs is a different consideration from scarcity of developer time.