Help me I’m stooopid.

Plus, when you have consumers too stupid to even make it on the short bus buying $800 Apple watches, a $1,600 top end GPU is relatively cheap.
I refuse to get a 'smart watch'. I spent more than $800 on my latest Rado watch, but it can last a lifetime and will never run out of power (automatic, so no battery). Plus it looks damned nice.

I have family members using these 'smart' watches. I don't feel the need to be THAT connected/available every moment I'm awake. NOT having one makes it easier for me to ignore things as needed. IME, far better for mental health...
 
I have a Rado, I love it, don’t wear it a lot since I don’t dress up for things like going to the office anymore. No problem with nice watches, I’m just making fun of an $800 Apple Watch.
 
Plus, when you have consumers too stupid to even make it on the short bus buying $800 Apple watches, a $1,600 top end GPU is relatively cheap.
You can spend less than that, though, and get 90% of the performance. The law of diminishing returns (LoDR) usually dictates that the last 10-15% is stupidly expensive. That's why I bought a 3080 Ti instead of a 3090; at the time the difference wasn't worth the $800 gap, although it's probably a narrower gap now. Same thing with i7 vs i9... the delta is basically noise that I'm not gonna pay for 🤣 "oh noes, I only get 90 fps at 1440p instead of 96 fps!!!" Ooolyoooly oooly oh. 🤣
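To put rough numbers on that diminishing-returns point, here's a quick sketch; the prices and frame rates are placeholder assumptions for illustration, not actual street figures:

```python
# Back-of-the-envelope on the law of diminishing returns (LoDR).
# Prices and fps numbers are assumed placeholders, not real quotes.
cards = {
    "3080 Ti": {"price": 1200, "fps_1440p": 90},
    "3090":    {"price": 2000, "fps_1440p": 96},
}

cheap, top = cards["3080 Ti"], cards["3090"]
extra_cost = top["price"] - cheap["price"]          # the $800 gap
extra_fps = top["fps_1440p"] - cheap["fps_1440p"]   # the 6 fps gap

print(f"Cheaper card: ${cheap['price'] / cheap['fps_1440p']:.0f} per fps")  # ~$13/fps
print(f"Last few fps: ${extra_cost / extra_fps:.0f} per fps")               # ~$133/fps
```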
 
That's why I bought 1310 nm transceivers rather than 1550 nm for the Hopkinton club.
 
You can spend less than that, though, and get 90% of the performance. The law of diminishing returns (LoDR) usually dictates that the last 10-15% is stupidly expensive. That's why I bought a 3080 Ti instead of a 3090; at the time the difference wasn't worth the $800 gap, although it's probably a narrower gap now. Same thing with i7 vs i9... the delta is basically noise that I'm not gonna pay for 🤣 "oh noes, I only get 90 fps at 1440p instead of 96 fps!!!" Ooolyoooly oooly oh. 🤣

I don't disagree. Your solution makes more sense, esp. since you can just swap out GPUs along the way. I'm not gonna do that, though. I'm just looking to future-proof myself for as long as I can until that PC dies, just like this one did.

I fully grasp that this is dumb in practical terms.

Although swapping GPUs down the road is easy. Even I can do that.

I stop at dealing with motherboards and chips.
 
You bring up a good point though… instead of max future-proofing, why wouldn't I aim slightly lower and just replace the GPU down the road?

I mean that can't take long to do.

I can't remember: there's no water-cooling bullshit on the GPU I'd have to disconnect, right?

Just pull it from the PCIe slot and remove the power cables, then insert the new card & attach power?
 
Have you determined the GPU to be the source of the "no input to monitor" issue?
 
God no. Got it all disconnected, opened it up, made sure everything inside was connected correctly, went to the garage to blow out all the pubes it had collected inside with the air compressor, reconnected… and still nothing.

A bad GPU wouldn't explain zero power to the mouse and keyboard.
 
You bring up a good point though… instead of max future-proofing, why wouldn't I aim slightly lower and just replace the GPU down the road?

I mean that can't take long to do.

I can't remember: there's no water-cooling bullshit on the GPU I'd have to disconnect, right?

Just pull it from the PCIe slot and remove the power cables, then insert the new card & attach power?

Mostly… assuming your PSU can handle the new demands. Honestly, with my current new rig I'm hoping to avoid even an in-line upgrade for 5-6 years, but who knows. I usually do one GPU upgrade per box-life, but generationally the leaps aren't that big anymore. Like going from my 1070 to a 3080 Ti is a big step, but not nearly as big as going from the 560 Ti to the 1070 was at the time.
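A rough way to sanity-check whether an existing PSU can take a new card; every wattage below is a made-up example, so swap in the real specs for your parts:

```python
# Crude PSU headroom check before a GPU swap.
# All numbers are illustrative assumptions, not spec lookups.
psu_watts = 850        # hypothetical PSU rating
cpu_watts = 125        # hypothetical CPU power draw
new_gpu_watts = 350    # roughly what a 3080 Ti-class card can pull
rest_of_box = 100      # fans, drives, RAM, USB, plus margin for spikes

load = cpu_watts + new_gpu_watts + rest_of_box
print(f"Estimated load: {load} W of {psu_watts} W ({load / psu_watts:.0%})")
if load > 0.8 * psu_watts:
    print("Cutting it close; budget for a bigger PSU along with the GPU.")
```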
 
Interesting… this is intriguing. I mean, the only argument on the other side is that the difference between, say, a 3080 Ti and a 3090 Ti is a few hundred bucks. When you spread that over many years of ownership, it's kind of insignificant?

But then the argument back in my own face is that the 3090 Ti today is one thing, while in three years the GPU series of the day will be significantly better.

Could just make sure I max out the PSU for future upgrades. I mean, that's relatively cheap.
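Spreading a hypothetical price gap over the life of the box makes the "insignificant over many years" argument concrete; the $400 gap and six-year lifespan below are just assumptions:

```python
# Amortizing the 3080 Ti vs 3090 Ti price gap over the box's lifetime.
price_gap = 400    # assumed difference between the two cards
years_owned = 6    # assumed life of the build before a full rebuild

print(f"Extra cost per year: ${price_gap / years_owned:.0f}")  # ~$67/year
# Small per year, but in three years that same money likely buys a bigger
# jump as part of whatever the mid-range card of that generation is.
```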
 
Or let's put it in these terms: the difference between a 3080 and a 3090 is likely not the difference between being able to play a decent game or not, even at goosed settings. Of course somebody is going to produce an extreme example; there are some bleeding-edge games that will destroy any GPU at impossible settings, especially at 4K… 🤣 but that just becomes retarded. That's another reason I stuck with 1440p: since I know that's what I'm working off, I'm going to get a huge lifespan out of the GPU because I'm not chasing the 4K deluxe retardation shithouse parade. Which, if you really dig into it, is sort of a fool's errand anyway. Even the console games pretty much cheat with rendering hacks/downscaled textures to obtain high FPS at 4K. Like I really need to see that female warrior's genital hair precisely to get that full immersion!!! 🤣
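For anyone wondering why 4K "destroys any GPU" compared to 1440p, the pixel math alone tells most of the story:

```python
# 4K pushes 2.25x the pixels of 1440p every single frame.
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k = 3840 * 2160      # 8,294,400

print(f"1440p: {pixels_1440p:,} pixels per frame")
print(f"4K:    {pixels_4k:,} pixels per frame")
print(f"Ratio: {pixels_4k / pixels_1440p:.2f}x")  # 2.25x
# Which is why consoles lean on upscaling and dynamic resolution
# to advertise "4K" at playable frame rates.
```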
 
Like going from my 1070 to a 3080 Ti
No one makes a new Crysis game anymore. It's questionable how badly needed even a 3070 Ti, never mind a 3080 Ti, will actually be. Well, unless one plays at 4K, of course, but all my monitors are 1440p and will remain so. So...
 
I have a Rado, I love it, don’t wear it a lot since I don’t dress up for things like going to the office anymore. No problem with nice watches, I’m just making fun of an $800 Apple Watch.
I wear my Rado Open Heart Automatic pretty much every day. I have another Rado in a case (battery-powered) that was my first. Need to see about selling the Movado; I don't think I'll ever wear it again.
 
Or let's put it in these terms: the difference between a 3080 and a 3090 is likely not the difference between being able to play a decent game or not, even at goosed settings. Of course somebody is going to produce an extreme example; there are some bleeding-edge games that will destroy any GPU at impossible settings, especially at 4K… 🤣 but that just becomes retarded. That's another reason I stuck with 1440p: since I know that's what I'm working off, I'm going to get a huge lifespan out of the GPU because I'm not chasing the 4K deluxe retardation shithouse parade. Which, if you really dig into it, is sort of a fool's errand anyway. Even the console games pretty much cheat with rendering hacks/downscaled textures to obtain high FPS at 4K. Like I really need to see that female warrior's genital hair precisely to get that full immersion!!! 🤣

But, have you seen the shaved pussy in Witcher 3?

Aside from that… all your points are convincing me. I don't play anything that maxes out graphics to the point where I see queef heat-haze effects.
 
Just to add extra retardation against my own case, I’m not running 4K monitors.

So this makes this convo even more obvious; it's like I went to Mass General for their gender reassignment clinic.
 
Here on Friday on my couch downstairs…

LOST without a f***ing PC

I don't know if I can wait for the new GPUs.

I might actually have to talk to my wife?
 
No one makes a new Crysis game anymore. It's questionable how badly needed even a 3070 Ti, never mind a 3080 Ti, will actually be. Well, unless one plays at 4K, of course, but all my monitors are 1440p and will remain so. So...
I was going to joke about this. Then I saw they recently announced the 4th installment...
 
God no. Got it all disconnected, opened it up, made sure everything inside was connected correctly, went to the garage to blow out all the pubes it had collected inside with the air compressor, reconnected… and still nothing.

A bad GPU wouldn't explain zero power to the mouse and keyboard.
Bring it over and I'll look at it.
 
I can't even do that; I'll hit the water table before it's deep enough [rofl]

Actually, I can't either because I'm on granite ledge.

I can't even dig my own f***ing grave on my own property in this dumpster state.
 