PostPosted: Sun Aug 26, 2018 9:08 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20662
Location: NE Indiana, USA (NTSC)
Slashdot user Lonewolf666 is among those planning to switch to GNU/Linux when Windows 7 support ends.

The joke was that every other Star Trek film was good: just skip the odd ones and watch The Wrath of Khan (II), The Voyage Home (IV), and The Undiscovered Country (VI). Likewise with Windows, there was once a pattern of the odd-numbered versions being the usable ones: use 2000 and XP (5.0/5.1), skip Vista (6), use 7, skip 8. The problem is that Microsoft used "but Java would confuse Windows 9 with Windows 98" as an excuse not to release Windows 9, instead making a second consecutive even-numbered version to skip.
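For reference, the feared hazard was third-party code that sniffed the OS by checking whether its name starts with "Windows 9" (the widely circulated examples were Java os.name checks). Below is a minimal C++ sketch of that fragile prefix test, with a hypothetical isWindows9x helper, not any shipped product's code:

Code:
#include <iostream>
#include <string>

// Hypothetical legacy check: anything whose name starts with "Windows 9"
// is assumed to be Windows 95/98. A real "Windows 9" would match too.
bool isWindows9x(const std::string& osName) {
    return osName.rfind("Windows 9", 0) == 0;  // "starts with", pre-C++20 idiom
}

int main() {
    std::cout << isWindows9x("Windows 98") << '\n';  // 1: the intended match
    std::cout << isWindows9x("Windows 9")  << '\n';  // 1: the feared false positive
    std::cout << isWindows9x("Windows 10") << '\n';  // 0
}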


PostPosted: Sun Aug 26, 2018 10:48 am 
Formerly Espozo

Joined: Mon Sep 15, 2014 4:35 pm
Posts: 3377
Location: Richmond, Virginia
Sounds like a BS excuse on Microsoft's part; I'd think software would look at the number Windows uses internally. I assumed they went to 10 to put extra distance between it and 8. :lol:


PostPosted: Sun Aug 26, 2018 11:00 am 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20662
Location: NE Indiana, USA (NTSC)
Internally, Windows Vista is NT 6.0, Windows 7 is NT 6.1, Windows 8 is NT 6.2, and Windows 8.1 is NT 6.3, because all four use the NT 6 driver ABI. Windows 10 is NT 10.0.
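To see that internal number from code, something like the sketch below works, using the classic (now deprecated) GetVersionEx API. Note that on Windows 8.1 and 10 it deliberately reports NT 6.2 unless the application carries a compatibility manifest, precisely because so much software keyed off these numbers:

Code:
#include <windows.h>
#include <cstdio>

int main() {
    OSVERSIONINFOA info = { sizeof(info) };  // dwOSVersionInfoSize must be set
    if (GetVersionExA(&info)) {              // deprecated since 8.1; shown for illustration
        std::printf("NT %lu.%lu (build %lu)\n",
                    info.dwMajorVersion, info.dwMinorVersion, info.dwBuildNumber);
    }
    return 0;
}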

(Star Wars has a different law: the pre-Disney movies make the most sense in the order IV, V, II, III, VI. How can that be tied into operating systems?)


PostPosted: Sun Aug 26, 2018 1:11 pm 

Joined: Sun Sep 19, 2004 9:28 pm
Posts: 3634
Location: Mountain View, CA
Drew Sebastino wrote:
Sounds like a BS excuse on Microsoft's part; I'd think software would look at the number Windows uses internally. I assumed they went to 10 to put extra distance between it and 8. :lol:

Maybe they did, just not in the way you'd expect: https://www.reddit.com/r/technology/com ... 0/ckwq83x/


PostPosted: Sun Aug 26, 2018 1:26 pm 
Formerly WheelInventor

Joined: Thu Apr 14, 2016 2:55 am
Posts: 1783
Location: Gothenburg, Sweden
There's also the aspect of marketing: half sociology and psychology, half internal marketing-team occultism/gut feeling. If a figure sounds too made-up or evened-out to be true, even when it is true (like the height of a mountain, or a cost estimate coming from a contractor), they may un-even it to appear more real. Statistics that end in an odd digit are thought to be more persuasive than ones that end in an even digit. Likewise, some numbers in product/service/model names may be avoided because they're perceived as 'weak' in regard to their market. If it sounds irrational, that's because marketing is all about dealing with the irrationality of consumers/customers, as opposed to what modernist thought once assumed.

_________________
http://www.frankengraphics.com - personal NES blog


PostPosted: Sun Aug 26, 2018 5:13 pm 

Joined: Sun Mar 08, 2015 12:23 pm
Posts: 281
Location: Croatia
koitsu wrote:
tokumaru wrote:
Is a reliable OS too much to ask for?

Yes, assuming the maintainers or software developers of the OS and/or its applications are a) young, b) inexperienced (read: do not know why it's a Bad Idea to do Thing X due to lack of history), c) doing "bare minimum" work (i.e. "ship it" mentality), or a combination of any of these. Doesn't matter if the OS is free or commercial -- we're seeing a pretty distinct downfall in the quality of both OSes as well as software applications in general. But answering honestly and personally: no, it's not too much to ask for (and this is why I tend to stick to "older" OSes, "older" programs, "older" things).

I think it can be blamed partially on the huge influx of programmers introduced in the last ~20 years, and partially on the fact that we've lowered the bar by making technology (and programming) "so easy" that nobody bothers to learn things at a lower level or cares to ask "is this a bad idea?" (instead the driving force is "code it/release it, worry about the effects later" -- a bad mentality). Ask a 20-something programmer if they know anything about Xerox PARC and what was actually going on there in the 70s/80s. Hell, ask them if they know what Doug Engelbart was most famously known for (answer if you don't know). I've been seeing a lot of people "re-discovering" things we already knew or did in the 80s/90s, and when asked "you mean like {thing} from 30 years ago?" they go "huh?" or "that's OLD" and then become defensive. The goal is no longer "understand technology and improve it gradually, really thinking deeply about things, and involve people who have more knowledge and historical experience, or at least ask them along the way"; the goal is "I LIKE TO DO THINGS WITH COMPUTERS" (yes, somehow the statement is the goal), to which I always say "right".

It also explains why we've had a huge influx of awful programming languages in the past 20 years -- they're less about introducing real improvements and more about reinventing the wheel because somebody didn't like how {other PL} did something, all while pushing wasteful things like excess abstraction (often of the "make all the things objects!" variety). Get guys like Luca Cardelli involved, not Rob Pike or Guido van Rossum. Anyway...

When it comes to Windows, it's sad, because the kernel itself continues to get better with every OS release. It's all the "other crap" that makes Windows what it is that's getting worse. Windows would run just fine without the "Microsoft Store" or its weird bundled apps. Some Linux distros (ex. Ubuntu) are beginning to do the same thing, shoving more and more "junk" into the system by default rather than keeping the OS (and the GUI) bare/simple and letting people essentially own/use what was designed.

I totally understand this. Like, how do you expect those "I LIKE TO DO THINGS WITH COMPUTERS" kind of people to be able to code anything when learning C is so hard? Even the first hello-world program seems very hard, because if you have scanf("%d",&mynumber);, many people will not type that "&", and the program will crash without them knowing what happened (a minimal sketch of this trap follows at the end of this post). Then you've got to learn pointers and dynamic memory allocation and all that crap. By the time you've learned C++ all the way through, with all those operators and constructors and libraries and how scoping works, you've already lost all inspiration for doing creative things! Which is what happened with me. I loved QBASIC 4.5 so much, but when I was faced with C/C++, I had a bad time. Luckily, I learned C so well that it takes me very little time to recall the important things before I go and code. But C++ is really hard! I mean, having to remember the evaluation order of stream operators and when an object is allocated and when it's deallocated, urgh...

But then there's this C# and Java and other crap with this stupidest invention in programming ever: garbage collection. I mean seriously! Why don't you let ME deallocate an object when I'm not using it? Or if this garbage collection is so smart, then why doesn't it do garbage collection upon every single reference nullifying? So when I see how messed up all of this is, I get completely absorbed in thinking about building a computer from pneumatic/hydraulic circuits, with my own CPU architecture, programming language, etc., and starting the whole IT infrastructure from scratch!
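To make the scanf trap concrete, a minimal sketch:

Code:
#include <cstdio>

int main() {
    int mynumber = 0;

    // Correct: "%d" needs the ADDRESS of an int to write into.
    if (std::scanf("%d", &mynumber) == 1) {
        std::printf("you typed %d\n", mynumber);
    }

    // The classic beginner bug: drop the '&' and scanf receives the VALUE
    // of mynumber as if it were a pointer, then writes through it.
    // Undefined behavior -- usually a crash.
    // std::scanf("%d", mynumber);

    return 0;
}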

But anyway, now that there are so many updates for everything, I feel kind of sad. There's no longer that good feeling of knowing where something is, because everything keeps changing with every update. The updates intrude on your computer, break things, steal your private information, etc. I mean, why can't things be as good as they were on Windows XP? Some zealous Linux guy says XP is a crap OS because everyone was abusing the kernel's security holes, and that Vista was actually awesome but got spat on because it patched all the security holes, which made all the programs stop working. I mean, why complain about that!? Windows XP worked! What it promised, it fulfilled! You couldn't ask for more. And if you were smart enough, you could find workarounds to make something work the way you wanted. But today, the very things that are promised to us, like just watching a video or seeing a picture or opening a stupid document, are already so hard! Word keeps doing some autocorrection stuff that I cannot turn off, because all the instructions on how to do that are outdated and the checkbox I'm looking for isn't shown in MY version of Word. The Windows 10 Photos app takes so long to load! Look at Windows XP! It loaded instantly! And what was so wrong with sndrec32.exe? It recorded stuff well! I mean, yeah, you could only record 60 seconds, but then Audacity came to fix that. What did Windows 7/10 do? They have a minimalistic program where you just hit record and stop. WHAT WERE THEY THINKING? I know. They weren't thinking! They didn't have anything to think with!

The main reason I loved computers as a kid is that they were so predictable. You press a button and it does what the button says. Today, nothing is predictable! How are our children going to know how awesome computers are when everything is breaking? Of course, unless they have some super expensive computers and sign an EULA giving away all their privacy, until one day this happens:
Someone: *knocks on the door* We came for the kidney.
Us: What kidney?
Someone: The one you promised us when you clicked Next, Next, and I Agree.

Seriously! When will politics and economics get the fuck out of computers!? I've had enough of politics in real life, which is why I'm using computers, AND NOW THEY'RE INVADING COMPUTERS! I seriously don't know what to do at this point. Maybe make some FPGA computer with its own CPU architecture and operating system, one that has a working C compiler, is self-sustaining, and can flash itself onto new FPGA chips.


PostPosted: Sun Aug 26, 2018 5:45 pm 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20662
Location: NE Indiana, USA (NTSC)
8bitMicroGuy wrote:
But then there's this C# and Java and other crap with this stupidest invention in programming ever: garbage collection. I mean seriously! Why don't you let ME deallocate an object when I'm not using it? Or if this garbage collection is so smart, then why doesn't it do garbage collection upon every single reference nullifying?

CPython, the reference implementation of the Python programming language, does exactly that using reference counting. But it can slow things down because of all the locking involved when multiple threads create and nullify references to objects. (This is why CPython has a global interpreter lock and prefers processes over threads for using multiple cores.) And because Python implementations on top of the JVM and CLR don't use reference counting, it's still good style to use finally or with to deallocate things, as one would in Java and C#.

std::shared_ptr in C++ also uses reference counting.
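A minimal sketch of that deterministic counting, using nothing beyond the standard library:

Code:
#include <cstdio>
#include <memory>

int main() {
    auto p = std::make_shared<int>(42);
    std::printf("count: %ld\n", p.use_count());      // 1
    {
        std::shared_ptr<int> q = p;                  // copying bumps the count
        std::printf("count: %ld\n", p.use_count());  // 2
    }                                                // q destroyed, count drops
    std::printf("count: %ld\n", p.use_count());      // 1
    return 0;
}   // count reaches 0 here; the int is freed at a predictable point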


PostPosted: Mon Aug 27, 2018 7:27 am 

Joined: Fri Oct 08, 2010 6:08 pm
Posts: 110
Location: NY, USA
Banshaku wrote:
Is that so? Great! I will be more than happy to change it once I know how. I will be waiting for the link then. thanks!


Here it is: https://www.cnet.com/how-to/how-to-get- ... indows-10/


PostPosted: Mon Aug 27, 2018 7:56 am 

Joined: Tue Feb 07, 2017 2:03 am
Posts: 613
8bitMicroGuy wrote:
I loved QBASIC 4.5 so much, but when I was faced with C/C++, I had a bad time. ...snip.. But then there's this C# and Java and other crap with this stupidest invention in programming ever: garbage collection. I mean seriously! Why don't you let ME deallocate an object when I'm not using it? Or if this garbage collection is so smart, then why doesn't it do garbage collection upon every single reference nullifying?

You do realise BASIC and QBASIC have garbage collection in them, right?

8bitMicroGuy wrote:
Seriously! When will politics and economy get the fuck out of computers!? I've had enough of politics in real life so I'm using computers AND NOW THEY'RE INVADING COMPUTERS! I seriously don't know what to do at this point. Maybe make some FPGA computer with its own CPU architecture and operating system that has a working C compiler and that it's self-sustainable and able to flash itself onto new FPGA chips.
Good news: RISC-V is now taking off, and you will be able to build your own computer with everything open source from the ground up ;) https://www.youtube.com/watch?v=L8jqGOgCy5M


PostPosted: Mon Aug 27, 2018 11:35 am 

Joined: Fri Nov 19, 2004 7:35 pm
Posts: 4093
Reference counting is not garbage collection?

_________________
Here come the fortune cookies! Here come the fortune cookies! They're wearing paper hats!


PostPosted: Mon Aug 27, 2018 12:11 pm 

Joined: Sun Mar 27, 2011 10:49 am
Posts: 265
Location: Seattle
8bitMicroGuy wrote:
But then there's this C# and Java and other crap with this stupidest invention in programming ever: garbage collection.


Garbage collection was a phenomenal idea, and I feel almost a little personally offended by how you're trashing the work of the hordes of very, very smart people who invented garbage collectors and have spent countless hours developing and refining them; they are often beautiful pieces of software.

8bitMicroGuy wrote:
I mean seriously! Why don't you let ME deallocate an object when I'm not using it? Or if this garbage collection is so smart, then why doesn't it do garbage collection upon every single reference nullifying?


Why not make the programmer responsible for deallocating objects? Because manual memory management fails in well-documented ways:

- They'll forget to free and leak memory, or they'll deallocate the object and then accidentally use it again.
- Code that manually manages memory is busywork that often obfuscates the logic of the program itself.
- Constantly malloc()'ing and free()'ing memory often introduces heap fragmentation and performance issues compared to a smart garbage collector (e.g. a GC'd language can allocate just by bumping a pointer in many schemes, whereas malloc()'ing requires searching and maintaining a free list).
- Contriving your program to ensure you can easily determine when an object is no longer in use makes code less readable, maintainable, or performant (due to over-copying), is error-prone, or just amounts to an ad-hoc reimplementation of reference counting/GC.

The issues with manual memory management, and the impact it has had on software quality over the years, are well documented and undeniable. (The first two failure modes are sketched below.)
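A minimal sketch of those first two failure modes, with hypothetical helper functions (C++ here only for concreteness):

Code:
#include <cstdlib>
#include <cstring>

// Failure mode 1: the owner forgets to free -> memory leak.
void leaky() {
    char* buf = static_cast<char*>(std::malloc(64));
    if (!buf) return;
    std::strcpy(buf, "oops");
    // no free(buf) on this path: every call leaks 64 bytes
}

// Failure mode 2: free the object, then accidentally use it again.
void use_after_free() {
    char* buf = static_cast<char*>(std::malloc(64));
    if (!buf) return;
    std::free(buf);
    buf[0] = 'x';  // undefined behavior: write through a dangling pointer
}

int main() {
    leaky();
    // use_after_free();  // left commented out; running it is UB
    return 0;
}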

8bitMicroGuy wrote:
Or if this garbage collection is so smart, then why doesn't it do garbage collection upon every single reference nullifying?


Well, you could mean a couple of things here: either you're advocating running GC every time a reference is lost (which would be unnecessarily slow), or you're suggesting something like a reference counter, which is viable and often used but does have well-known disadvantages (it can't handle reference cycles, wastes space, and is slow). If you have a better solution for automatic memory management, both research and industry would love to hear about it.
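The cycle problem fits on one screen. A sketch using C++'s reference-counted std::shared_ptr as a stand-in for any reference counter:

Code:
#include <memory>

struct Node {
    std::shared_ptr<Node> next;  // strong (counted) reference
};

int main() {
    auto a = std::make_shared<Node>();
    auto b = std::make_shared<Node>();
    a->next = b;
    b->next = a;  // a <-> b now form a reference cycle
    return 0;
}   // a and b leave scope, but each Node still holds the other, so both
    // counts stay at 1 and neither is freed. Pure reference counting can't
    // reclaim this; a tracing GC can, or the cycle can be broken manually
    // (e.g. with std::weak_ptr).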

Garbage collection isn't suitable for all use cases, but I absolutely can't imagine you've worked on any serious software and could still claim that it's "the stupidest invention in programming ever", even as hyperbole. It's seriously one of the smartest inventions in programming ever.


PostPosted: Mon Aug 27, 2018 1:28 pm 

Joined: Sun Mar 08, 2015 12:23 pm
Posts: 281
Location: Croatia
Alright, I've probably misjudged garbage collectors. It's because the stuff I was making used manual memory management, and I was good at it. When I had a college project where I had to make a text-to-speech translator (I'd write something in Croatian and it would rewrite it to sound like English, in a way Microsoft Sam could understand), I was using C# or C++ with a garbage collector, because of CLR nonsense. I had a hard time figuring out how to delete unused resources. Usually I'd just say delete(something);, but here it didn't work. I had such a hard time with that. But I guess maybe it was because I didn't know how to use it well. I'm sorry about that. Also, my opinion of garbage collectors was influenced by people who didn't know how to use them (AND WHO CRASHED MY PC BY DOING SO!) and by my mentor, whom I looked up to because he was my only source of knowledge about C/C++.


PostPosted: Mon Aug 27, 2018 3:37 pm 

Joined: Sun Sep 19, 2004 9:28 pm
Posts: 3634
Location: Mountain View, CA
8bitMicroGuy wrote:
Alright, I've probably misjudged garbage collectors. It's because the stuff I was making used manual memory management, and I was good at it. When I had a college project where I had to make a text-to-speech translator (I'd write something in Croatian and it would rewrite it to sound like English, in a way Microsoft Sam could understand), I was using C# or C++ with a garbage collector, because of CLR nonsense. I had a hard time figuring out how to delete unused resources. Usually I'd just say delete(something);, but here it didn't work. I had such a hard time with that. But I guess maybe it was because I didn't know how to use it well. I'm sorry about that. Also, my opinion of garbage collectors was influenced by people who didn't know how to use them (AND WHO CRASHED MY PC BY DOING SO!) and by my mentor, whom I looked up to because he was my only source of knowledge about C/C++.

For what it's worth, I have similar views to yours when it comes to garbage collection and other "automated-by-the-PL" features. My viewpoint is limited in scope, however -- that is to say, I don't program professionally (it's part of my job but not my sole job), and the software I have written/do write is fairly "linear" (read: doesn't need things like threading) and in a limited number of PLs (mainly C and Perl, but also things like PHP). And like most folks here, I also work on classic archs in lower-level PLs (i.e. 65xxx assembly).

I feel because of "what" I write/do, and the platforms I work on, my viewpoint is fairly skewed towards the classic model of memory management you describe: free() what you malloc()/calloc(), or else allocate on the stack (and be wise/smart about when to do which). Thus, today, it's hard for me to really judge GC-centric languages, because I myself have an already limited view given what my needs are/what I do. It means I'm biased against a thing because I have no immediate need for the thing... but just because I don't need something doesn't mean it doesn't have a good purpose. It doesn't mean that GC/refcounting is bad; it just means that when I hear it described as this amazing, incredible, all-encompassing magic smoke, I understandably roll my eyes. I'm absolutely certain there are folks with the exact opposite view, and I think they're effectively just as right as I am. I hope that makes sense to readers.

That said, one of the problems that does continue to plague software today -- specifically newer/younger PLs (ex. Go, Rust, maybe Python) -- is that their GC and ref-counting implementations aren't as solid as, say, what C++, C#, or even Java has. This is further amplified by the fact a lot of developers don't have any idea how to even begin tracking down performance problems that stem from the GC kicking in heavily (i.e. happening without their knowledge/control/consent). This in turn causes trade-offs being made in design/whatever (by the programmer of the application) to try and work around whatever the problem is, and these trade-offs are often perverse or bizarre.

So yes, while I am of the "old guard" mentality of "why bother with GC, free what you allocate", it doesn't mean this approach works for everyone or every situation. I try not to impose that on others too much, but I still have my opinion -- while simultaneously (in somewhat of a juxtaposition) keeping an open mind.


PostPosted: Mon Aug 27, 2018 5:16 pm 

Joined: Sun Mar 27, 2016 7:56 pm
Posts: 167
koitsu wrote:
That said, one of the problems that does continue to plague software today -- specifically newer/younger PLs (ex. Go, Rust, maybe Python) -- is that their GC and ref-counting implementations aren't as solid as, say, what C++, C#, or even Java has.

Just to avoid any misunderstanding, Rust does not use garbage collection; its memory safety comes from static analysis at compile time. As for ref-counting, I don't know how Rust's Rc<T> type compares against other languages', but the point at which it frees the data is predictable: when the last Rc<T> pointing to the data goes out of scope.


PostPosted: Mon Aug 27, 2018 5:30 pm 

Joined: Sun Sep 19, 2004 11:12 pm
Posts: 20662
Location: NE Indiana, USA (NTSC)
Memory management discussion continues

