AJsAWiz
Jun 13, 06:12 PM
I'm not letting AT&T off easily, but I still argue that half of the problem is the iPhone itself. When I'm the only person with an iPhone and everyone else around me is on old cell phones on the same network and they have 5 bars and I have no signal, there's a problem.
Are those other phones accessing the 3G network? I carried a non 3G network AT&T phone around with me and experienced none of the signal problems I had with my iPhone in the same areas.
heyisa
Sep 20, 11:53 AM
I'd rather wait for a mac mini w/iTV combo,
that would allow you to stream Bonjour content as well.
(could you imagine that in a dorm network!).
I think the second generation of this will be awesome, if Apple doesn't screw it up.
I hope you could also use it as a separate monitor for a computer.
Would make it really easy to hook up a computer to a projector that way.
eawmp1
Mar 13, 10:08 AM
More people have died in hydroelectric or coal generated power production. Nuclear is relatively safe and clean.
Apple OC
Apr 24, 12:00 AM
For what it's worth, I don't think you're an idiot.
You simply made a statement that I'm not willing to make.
I make the statement because that is how I see things ... as I said there is not even remote evidence that there are Gods or that there ever were.
Science has given me very logical and believable answers as to how life formed on Earth.
I am not one that is still searching for answers. ... some so called Atheists are hoping for the proof that there is or is not a God. ... Science has already given me all the proof I need.
BoyBach
Aug 29, 03:36 PM
Greenpeace is nothing but a group of eco-terrorists in my opinion.
Is that a logical or an emotional statement?
Trash Can
Jun 19, 02:10 PM
I got the iPhone 3G two years ago after being with Verizon for seven. Love the phone - hate the network! I travel extensively. Some places my 3G is reliable, others not so much.
The iPhone cannot be beat in terms of engineering, user interface, and elegance. iPhone 4 appears to raise the bar even further. However, a phone is only as good as the network that it's on. As much as I love the thing, many times it functions more like an iPod Touch than a phone. With the introduction of the iPad 3G and the influx of new iPhone 4 users, I believe AT&T's network issues will get worse before they get better.
I've decided to leave AT&T, return to Verizon, and get a Droid. Is a Droid my device of choice? No, but at least I'll have reliability, something that I haven't had for the last two years. I've experienced more dropped calls with AT&T in one week than I have with Verizon over several years.
If and when Verizon ever gets the iPhone, I'll be in line with everyone else on launch day. Until then, I'm sure there will be days when I'm somewhat envious of those who purchased iPhone 4 (and possibly subsequent models), but at least I'll know I'll be able to make and receive calls reliably and without interruption.
milo
Jul 13, 11:17 AM
Apple will offer a New Form Factor 64-bit Dual-Core Conroe Mini-Tower whether or not a single chip Woodie is in the lineup. They'll have no choice.
Not necessarily. They could also just put the conroe in the base model with the same form factor, although they probably wouldn't be able to get it as cheap. I don't really care if they go with the mini form factor or not as long as the price is low enough.
the single xeon configs i was referring to were netburst based ones.
(snip)
apple tried the powermac mini as it were and you did not buy it, it was called the g4 cube.
That's a $300 difference in list price. Even if apple pays half of that, it's a significant amount, not to mention that the difference goes higher the more ram you buy.
Sure, it makes sense for companies to offer a single woodcrest config IN ADDITION to conroe configs. It mostly makes sense for users who want to add the second chip themselves in the future. But all those companies also will sell conroe configs, and they will be cheaper. It just doesn't make sense to sell single woodcrest as a substitute for conroe, apple would likely be the only company doing that.
And the cube failed because it was simply outrageously overpriced (I would NOT consider it "powermac" by any stretch of the imagination, but it still cost almost as much as the full towers). They brought it back as the mini which has sold very well and demonstrated that people DO want smaller, cheaper alternatives.
iMeowbot
Jul 11, 10:25 PM
As even AI notes, there's not much difference between the two chips. This is about as exciting as finding out that a faucet will have a red handle if it runs hot water, blue if cold. Whee.
Sydde
Mar 15, 06:40 PM
Somewhere I think I read that Fukushima Dai-ichi was just a few months away from final retirement of the entire facility after twice its designed lifetime. But there almost certainly must be spent fuel rods in all the basins, since fuel changes are done at least as often as 18 months and spent fuel takes two to four years to cool enough to be safely moved offsite. The fuel still contains enough U-235 to produce considerable heat from just decay, but internal pollutants reduce its ability to contribute in a reactive core. Presumably, spent fuel is not considered to be able/likely to generate a critical event (neutron flux is too compromised by pollutants) so it would not require such sturdy containment as would a reactor.
To me, this operation looks slightly slipshod, almost like brinkmanship. Pushing nuclear systems even halfway to their limits seems too risky.
Bosunsfate
Sep 12, 03:21 PM
So it seems from the coverage that the device has no optical drive, and no internal mass storage? Is that correct? And also that it is not itself a DVR? Don't get me wrong -- I'm reserving judgment. I just want to understand at this point. It sounds as if the basic purpose of the device is to draw high quality AV off a computer and onto a home entertainment system, sort of as the Roku SoundBridge did for the iPod's audio, but in a very Apple sort of way? In other words, it follows the computer-centric sort of model where a desktop or notebook Mac on the network is the "server"?
I would make the same guess as well.
Trying to get the QT stream, but overloaded right now.
twochoicestom
Apr 13, 09:14 AM
aside from all of this..
HELVETICA is blatantly coming to Lion. Looking good in FCP!
myamid
Sep 12, 06:35 PM
Just because you can't see the difference between 480p and 720p doesn't mean that other people can't. I think this distinction is like night and day, but quality is subjective, I'll give you that.
Ok, I didn't see it... but it's not enough to warrant a 4GB extra download for an iTunes purchase... Let's put it that way :)
I'd take VERY good 480p versus mediocre 720p any day.
I apply that standard even today for HD DVD / BluRay... Movies in those 2 formats right now DO NOT warrant the extra expenditure... HD sure... on paper, but in practice, it's still not all it's cracked up to be.
On a sidenote, don't get me wrong, I can barely stand watching SD channels on TV these days... You get used to HD really quick... But I don't think the download/streaming market is "right" for HD content...
jvegas
Sep 12, 03:55 PM
Will it support third party codecs?
Does it have an internal flash drive?
Will I be able to order Music, TV shows and Movies using it?
Do I need a separate computer to use it?
So far, I'm not impressed. How's it different than a media extender?
I would rather have seen a mac mini with core 2 duo, better graphics support, an internal 3.5" hard drive, and HDMI.
techwarrior
Nov 12, 12:14 PM
Add me to the happy list. I have had all iPhones since the 3G and rarely lose a call; one or two places I typically go have poor service, so I let others know I will call back if I drop in these spots. MCell has done wonders for the poor service at my home.
ATT is the only service I can get at work. Because my office is an R&D facility for a company that makes phone systems, they block all external wireless signals and then put ATT repeaters in the building.
So, for me, it would take a lot to push me over the edge to move to another provider. I do like how others are pushing ATT to adopt more competitive plan options, and I think competition from TMo, Sprint/Nextel and Vz can only be good for those of us who can stay with ATT.
Huntn
Apr 25, 12:30 PM
Absolutely correct. It is irrelevant because it is unknowable so let's not pretend or imagine or try to know the unknowable. Let's live our lives in peace.
This takes responsibility away from what God would want, to what we think is right. I believe this to be a more realistic approach.
I certainly feel that most atheists are what I would call agnostic atheists. They lack belief in a god but leave the question of such a being existing either open and yet to be proved or unknowable and, therefore, pointless to contemplate. Only a so-called gnostic atheist would say they have seen sufficient evidence to convince them there is no god, and I have not seen too many of them in my travels. It's more likely that they have yet to see sufficient evidence so, while they do not specifically believe in his existence, they cannot categorically deny it either. The blurry line between atheism and agnosticism is fairly crowded, I think.
It's simply "don't believe" as contrasted with "don't know". I think it's a very important distinction for some Atheists who go beyond the "unknown" position into a more definitive negative view regarding deities. The problem as I see it is not so much that a deity may exist; it's all the purported rules and regs associated with said deity that make it easy to cast doubt.
You've just made good points, Huntn. I'm sure that many, maybe even most, people have much the same knee-jerk reaction you have. I pointed out some distinctions, though, because nowadays, when many think unclearly, they ignore those distinctions. Each time I hear someone say "I feel" when he should say "I believe" or "I think," the phrase "I feel" reminds me of subjectivism.
Someone here, Lord Blackadder, I think, told me that I didn't understand the "pluralistic society" idea. I do understand it, and I know that many people disagree with me on many topics. I'm willing to learn from others. I even suspect that my false beliefs outnumber my true ones. But if disagreement among people proves anything, it proves that some people hold some false beliefs. If I believe that there's a God and you believe that there's no God, one of us is wrong. Today too many talk as though the freedom to believe what one wants to believe is more important than the truth.
Sure, it's often better to say "I don't know" rather than "I don't believe" because most people probably haven't learned the distinctions I've described. On the other hand, although knowing that a belief is true implies believing that it's true, believing that it's true doesn't imply knowing that it's true. If believing always implied knowing, everyone would be all-knowing.
Say I've deluded myself into believing that my honorary Brian is still living when he is, in fact, already dead. No one is helping me by saying that "Brian is still alive" is true for Bill but not for Brian's family. If I were deluded, the longer my delusion lasted, the more painful my disillusionment would be. I want to know the truth, even if it's unpleasant.
The problem is that the concept of God is subjective. And if any God exists, then 1) It is a horrible communicator, or 2) It does not really care, because if it did, it would rely on more than ancient scripts, and it would take more care to ensure those scripts were accurate. (They don't appear accurate to me.)
We exist, and there may be an afterlife. I really do hope there is a spiritual plane where consciousness may continue. And there may be judgement, but these are huge IFs, mostly based on our desire that there is more to life than our meager existence on this planet.
For fun please judge this statement: God can't prove its existence. If anyone disagrees, what real proof would be required? I'm not talking about those very subjective "feelings". ;)
eric_n_dfw
Mar 20, 07:19 PM
But what if I got hold of that wedding video and decided to, I dunno, turn it into a music video for my own music... and that music video got onto MTV? No one is losing out on any money. No one is being hurt. I'm not stealing. I'm -merely- infringing copyright.
The videographer is being hurt; you and/or MTV have stolen the royalties they are due. (Assuming you are saying that it is someone else's video, not one that you shot and/or edited together.)
If it was produced by a videographer, they were probably smart enough to mark it with a copyright (you don't have to file anything to do so), and then they can sue you for that infringement because you are profiting off of his/her work. (Or, more likely, they'd sue Viacom for broadcast of their video without permission, since they have the deeper pockets. But Viacom is probably immune because you signed a paper saying you owned said production - THEN they'd sue you.)
The theft in this is the result of the infringement. By admitting it's infringement, you are admitting that it's illegal. The only reason to copyright something is to protect your interests from those who would, well, infringe on them. :rolleyes:
skunk
Apr 26, 05:20 PM
Have we just passed through the looking glass? :confused:
rasmasyean
Apr 22, 11:47 PM
It's believed that the Higgs Boson exists, but as yet there is no proof of its existence. Despite this, respected physicists continue to try and prove its existence.
There are many things we believe in the existence of despite lack of tangible proof.
The Higgs Boson is something that is speculated to exist based on mathematical models and observation of other properties in theory. Therefore they try to "look for it" in order to confirm their models.
Einstein's special relativity was also speculated to exist based on mathematical models. And there was no way to observe and "prove" that those phenomena exist until modern equipment was invented...like GPS.
Even when Einstein derived that light travels in "particles", it explained a lot of things, but it isn't really until now that we use "photons" to bombard atoms to do quantum mechanical work...like solar panels. But they were derived to exist based on some other doctrine that works in real life (not just your mind).
There is a line between using an established doctrine to determine something can exist vs. "faith" in something that exists with no basis to draw upon other than some book written thousands of years ago...presumably. That's why it's called "faith".
r1ch4rd
Apr 22, 09:57 PM
And if over two thousand years from now people still believe in the Higgs Boson despite no evidence that it exists I'd likely be skeptical of their beliefs as well.
Hopefully we will find the answer soon enough because there are scientists working on both sides to prove and disprove the higgs boson and once we have it agreed one way or the other, we won't have many scientists preaching that you should have blind faith alone. The higgs boson is not going to be testing our loyalty!
The key thing for me that gives science credibility over religion is the ability to go back and revise your "beliefs" based on more recent findings or new understanding.
jaseone
Mar 19, 05:59 PM
I wish people would understand that this program is mainly created so that people who use Linux (don't know if you have heard of it, it has a larger market share than Mac OS X if I remember right :rolleyes: ) can listen to the music which they have purchased.
Uhm why is the program Windows only then???
KnightWRX
May 2, 05:51 PM
Until Vista and Win 7, it was effectively impossible to run a Windows NT system as anything but Administrator. To the point that other than locked-down corporate sites where an IT Professional was required to install the Corporate Approved version of any software you need to do your job, I never knew anyone running XP (or 2k, or for that matter NT 3.x) who in a day-to-day fashion used a Standard user account.
Of course, I don't know of any Linux distribution that doesn't require root to install system wide software either. Kind of negates your point there...
In contrast, an "Administrator" account on OS X was in reality a limited user account, just with some system-level privileges like being able to install apps that other people could run. A "Standard" user account was far more usable on OS X than the equivalent on Windows, because "Standard" users could install software into their user sandbox, etc. Still, most people I know run OS X as Administrator.
You could do the same as far back as Windows NT 3.1 in 1993. The fact that most software vendors wrote their applications for the non-secure DOS based versions of Windows is moot, that is not a problem of the OS's security model, it is a problem of the Application. This is not "Unix security" being better, it's "Software vendors for Windows" being dumber.
It's no different than if, instead of writing my preferences to $HOME/.myapp/, I wrote software that required writing everything to /usr/share/myapp/username/. That would require root in any decent Unix installation, or it would require me to set permissions on that folder to 775 and make all users of myapp part of the owning group. Or I could just go the lazy route, make the binary 4755 and set mount opts to suid on the filesystem where this binary resides... (ugh...).
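The two Unix-side workarounds described above can be sketched as a few shell commands. This is a minimal illustration using a hypothetical `myapp` and a scratch directory under /tmp, so nothing system-wide is actually touched:

```shell
# Hypothetical shared-state directory for "myapp" (scratch path for the demo,
# standing in for /usr/share/myapp/).
APPDIR=/tmp/myapp-demo/share/myapp
mkdir -p "$APPDIR"

# Option 1: mode 775 (rwxrwxr-x) plus putting every user of myapp in the
# owning group, so regular users can write there without root.
chmod 775 "$APPDIR"
ls -ld "$APPDIR"    # prints: drwxrwxr-x ...

# Option 2, the "lazy route" the post warns against: a setuid binary
# (mode 4755) that always runs with its owner's privileges.
# chmod 4755 /usr/local/bin/myapp    # shown only as a comment -- avoid this
```

The setuid route is the risky one because any bug in that binary executes with the owner's (often root's) rights, which is exactly why the post groans at it.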
This is no different on Windows NT based architectures. If you were so inclined, with tools like Filemon and Regmon, you could granularly set permissions so that misbehaving software like this would work for regular users.
I know I did many times in a past life (back when I was sort of forced to do Windows systems administration... ugh... Windows NT 4.0 Terminal Server edition... what a wreck...).
Let's face it, Windows NT and Unix systems have very similar security models (in fact, Windows NT has superior ACL support out of the box, akin to Novell's close-to-perfect ACLs; Unix is far more limited with its read/write/execute permission scheme, even with POSIX ACLs in place). It's the hoops that software vendors outside the control of Microsoft made you go through that forced lazy users to run as Administrator all the time and gave Microsoft such headaches.
As far back as I remember (when I did some Windows systems programming), Microsoft was already advising to use the user's home folder/the user's registry hive for preferences and to never write to system locations.
The real difference, though, is that an NT Administrator was really equivalent to the Unix root account. An OS X Administrator was a Unix non-root user with 'admin' group access. You could not start up the UI as the 'root' user (and the 'root' account was disabled by default).
Actually, the Administrator account (much less a standard user in the Administrators group) is not a root level account at all.
Notice how a root account on Unix can do everything, just by virtue of its 0 uid. It can write/delete/read files from filesystems it does not even have permissions on. It can kill any system process, no matter the owner.
Administrator on Windows NT is far more limited. Don't ever break your ACLs or don't try to kill processes owned by "System". SysInternals provided tools that let you do it, but Microsoft did not.
All that having been said, UAC has really evened the bar for Windows Vista and 7 (moreso in 7 after the usability tweaks Microsoft put in to stop people from disabling it). I see no functional security difference between the OS X authorization scheme and the Windows UAC scheme.
UAC is simply a gui front-end to the runas command. Heck, shift-right-click already had the "Run As" option. It's a glorified sudo. It uses RDP (since Vista, user sessions are really local RDP sessions) to prevent being able to "fake it", by showing up on the "console" session while the user's display resides on a RDP session.
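The "glorified sudo" comparison can be made concrete with a one-liner. The Windows side appears only as comments since `runas` isn't available in a Unix shell, and the `sudo` call itself is commented out so the sketch runs without needing elevation:

```shell
# Unix: run a single command under another identity after authenticating.
# (Commented out so this runs unprivileged.)
#   sudo -u root id        # would report uid=0(root) after a password prompt
id -u                      # the unelevated uid of the current session

# Pre-UAC Windows equivalent (shift-right-click "Run As", or from cmd.exe):
#   runas /user:Administrator notepad.exe
# UAC wraps the same elevate-one-command idea in a consent prompt.
```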
There, you did it, you made me go on a defensive rant for Microsoft. I hate you now.
My response: why bother worrying about this when the attacker can do the same thing via shellcode generated in the background by exploiting a running process, so that the user is unaware that code is being executed on the system?
Because this required no particular exploit or vulnerability. A simple Javascript auto-download and Safari auto-opening an archive and running code.
Why bother? You're not "getting it". The only reason the user is aware of MACDefender is because it runs a GUI-based installer. If the executable had had no GUI code and just ran stuff in the background, you would have never known until you couldn't find your files or some guy was buying goods with your CC info, fished right out of your "Bank stuff.xls" file.
That's the thing: infecting a computer at the system level is fine if you want to build a DoS botnet or something (and even then, you don't really need privilege escalation for that; just set login items for the current user and run off a non-privileged port. Root privileges are not required for ordinary network access, only for raw sockets such as ICMP).
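As a quick sketch of the non-privileged-port point: binding a TCP socket above port 1024 needs no root at all on a default Unix configuration (the loopback address and port choice here are arbitrary):

```python
import socket

def bind_unprivileged(port: int = 8080) -> int:
    """Bind a TCP socket to a non-privileged port without root.

    Only ports below 1024 are privileged on a default Unix setup.
    Passing port 0 asks the kernel to pick any free port; the port
    actually bound is returned.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("127.0.0.1", port))
    bound = s.getsockname()[1]
    s.close()
    return bound
```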
These days, malware authors and users are much more interested in your data than your system. That's where the money is. Identity theft, phishing, they mean big bucks.
Of course, I don't know of any Linux distribution that doesn't require root to install system wide software either. Kind of negates your point there...
In contrast, an "Administrator" account on OS X was in reality a limited user account, just with some system-level privileges like being able to install apps that other people could run. A "Standard" user account was far more usable on OS X than the equivalent on Windows, because "Standard" users could install software into their user sandbox, etc. Still, most people I know run OS X as Administrator.
You could do the same as far back as Windows NT 3.1 in 1993. The fact that most software vendors wrote their applications for the non-secure DOS based versions of Windows is moot, that is not a problem of the OS's security model, it is a problem of the Application. This is not "Unix security" being better, it's "Software vendors for Windows" being dumber.
It's no different than if, instead of writing my preferences to $HOME/.myapp/, I wrote software that required writing everything to /usr/share/myapp/username/. That would require root in any decent Unix installation, or it would require me to set permissions on that folder to 775 and make all users of myapp part of the owning group. Or I could just go the lazy route, make the binary 4755 and set mount opts to suid on the filesystem where this binary resides... (ugh...).
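The 775-plus-owning-group approach above can be sketched in a few lines (the shared directory here is just a temp dir for illustration); note it avoids any setuid binary:

```python
import os
import stat
import tempfile

def make_group_shared(path: str) -> None:
    """Set rwxrwxr-x (0o775): any member of the owning group can
    write, without resorting to a 4755 setuid binary."""
    os.chmod(path, stat.S_IRWXU | stat.S_IRWXG | stat.S_IROTH | stat.S_IXOTH)

d = tempfile.mkdtemp()          # stand-in for /usr/share/myapp/
make_group_shared(d)
mode = stat.S_IMODE(os.stat(d).st_mode)
# mode is now 0o775
```

In a real deployment you would also chgrp the directory to the app's group; that step is omitted here since it needs a pre-existing group.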
This is no different on Windows NT based architectures. If you were so inclined, with tools like Filemon and Regmon, you could granularly set permissions in a way to install these misbehaving software so that they would work for regular users.
I know I did many times in a past life (back when I was sort of forced to do Windows systems administration... ugh... Windows NT 4.0 Terminal Server edition... what a wreck...).
Let's face it, Windows NT and Unix systems have very similar security models (in fact, Windows NT has superior ACL support out of the box, akin to Novell's close-to-perfect ACLs; Unix is far more limited with its read/write/execute permission scheme, even with POSIX ACLs in place). It's the hoops that software vendors outside the control of Microsoft made you go through that forced lazy users to run as Administrator all the time and gave Microsoft such headaches.
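To illustrate just how coarse the traditional rwx scheme is: write access can only ever be granted to three classes (owner, group, other), as this small sketch shows, whereas NT-style ACLs can carry an arbitrary list of per-principal entries:

```python
import stat

def writable_classes(mode: int) -> list[str]:
    """Who can write, under the coarse owner/group/other model.

    Granting write to one specific extra user is impossible with
    plain mode bits; that needs POSIX ACLs or NT-style ACEs.
    """
    classes = []
    if mode & stat.S_IWUSR:
        classes.append("owner")
    if mode & stat.S_IWGRP:
        classes.append("group")
    if mode & stat.S_IWOTH:
        classes.append("other")
    return classes
```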
As far back as I remember (when I did some Windows systems programming), Microsoft was already advising to use the user's home folder/the user's registry hive for preferences and to never write to system locations.
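The per-user-preferences advice looks like this in practice; a minimal sketch, where "myapp" and the prefs format are just placeholders:

```python
import json
import os

def save_prefs(prefs: dict, app: str = "myapp") -> str:
    """Write per-user preferences under $HOME (the Unix analogue of
    the user's registry hive), so no elevated rights are needed.

    Writing to a system location like /usr/share/myapp/ instead is
    exactly the misbehavior that forced users to run elevated.
    """
    cfg_dir = os.path.join(os.path.expanduser("~"), "." + app)
    os.makedirs(cfg_dir, exist_ok=True)
    path = os.path.join(cfg_dir, "prefs.json")
    with open(path, "w") as f:
        json.dump(prefs, f)
    return path
```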
GGJstudios
May 2, 02:51 PM
I love how you all pretend like this is the first piece of intrusive software (Malware) for Macs ...
As already stated, it's not the first. Who's pretending that it is?
... like there's no such thing as a virus for Mac...
For Mac OS X, there isn't.
Multimedia
Oct 28, 03:07 PM
OK, so I now know what capabilities the new machines will potentially have. If I look at the Apple Store and see the 3 current base options & prices, what is the speculation on choices & prices when the release occurs?
I am also wanting to know that if I have decided that the current 2.66 GHz meets my needs, should I hold off because they may bump the speed, lower the price, etc., etc. I also understand that everything is pure speculation. I am also not wanting to shoot myself because something else happens to the current line up.
I appreciate the thorough & in-depth responses. It helps.

This is a fairly short thread. All your questions and answers have been discussed in depth above. You should wait in case they bump the base RAM to 2GB, since that's the new base in the MacBook Pros.
Figure an extra $800-$1400 for the 8-core.
I'mAMac
Oct 29, 10:08 AM
I heard somewhere that the Clovertowns are actually slower than the Xeons, but with 2x as many cores will there be much difference?