Software Optimization

CraigCole

New member
Nov 5, 2015
When it comes to computers I'm pretty much a lay person. I know the basics of RAM and processor cores, and I can mess around with the Windows Registry if given exact instructions, but that's about it. I know nothing about building apps or software code.

As a general consumer, I own a Lumia 950 and was curious about the performance oddities of W10M. Why, with a six-core processor and THREE GIGS of RAM, does it stutter, lag and pop up loading screens? With such prodigious resources shouldn't it fly through every task you throw at it?

But it doesn't. Hopefully you computer experts on the forum will chime in. Is W10M just that inefficient? I've heard it's not optimized for the hardware, but what exactly does that mean? Also, why would Microsoft release software in such a poor state that it doesn't even run very well on an ostensibly flagship-level device of its own making? How long does it take to optimize for the hardware, and why haven't they done this yet, a year after the device was released? I realize some of these questions are unanswerable, but I nonetheless feel they're worth asking.

Since I'm a nostalgic sort, I dug out my old Samsung Focus the other day. It's one of the original WP7 devices. With dramatically less powerful hardware it boots in about half the time of my 950 and is generally snappier (though, of course, it does a lot less). In fact, even after all these years the maps app is smoother and faster than the one on my Lumia; I'm not kidding. It consistently loaded things faster and was smoother to use, which blew me away since it makes do with a 1 GHz processor and 512 MB of RAM. I guess they were indeed the good old days.

Thanks!
 

xandros9

Active member
Nov 12, 2012
It's no secret that Windows 10 Mobile is quirkier and less smooth than its predecessors. Microsoft has laid off many of its QA engineers and put W10M on the back burner, and it shows.
 

BackToTheFuture

New member
Aug 9, 2013
It's no secret that Windows 10 Mobile is quirkier and less smooth than its predecessors. Microsoft has laid off many of its QA engineers and put W10M on the back burner, and it shows.

No, you are wrong. This has nothing to do with QA or layoffs. The reason is simply that W10M is much more complex than WP7 and WP8. WP7 was built on the WinCE kernel, which was designed for embedded systems with very limited resources. As a result, its subsystems were simpler (lacking abstraction layers) and faster, which made apps faster as well. But it lacked scalability; it was hard to expand the OS. Plus, MS went all-in on optimizing WP7, knowing the limited hardware the OS would run on. This still held true for WP8: they switched to an NT kernel similar to regular Windows, but kept the compact subsystems. The lack of abstraction layers made adding features more difficult.

This changed with W10M: they made it very similar to regular Windows, and far more scalable. The more complex the subsystems are, the more function calls must be made to perform the same task, and that takes more time. On top of that, they now prioritize building features first and optimizing later - you can see a significant speed-up after the AU. A complex architecture makes bug-fixing more difficult as well.
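The function-call point can be sketched in a few lines of toy Python (none of this is Windows code, just an illustration): the layered version computes exactly the same answer as the direct one, but pays a call toll at every level of indirection.

```python
import timeit

# Toy sketch: the same computation done directly vs. through several
# layers of indirection. Every extra layer is one more function call,
# and that overhead accumulates across millions of operations.

def task_direct(x):
    return x * 2 + 1

def layer3(x):
    # The lowest layer does the real work.
    return x * 2 + 1

def layer2(x):
    # The middle layers just delegate downward.
    return layer3(x)

def layer1(x):
    return layer2(x)

def task_layered(x):
    return layer1(x)

if __name__ == "__main__":
    assert task_direct(7) == task_layered(7) == 15
    n = 200_000
    t_direct = timeit.timeit(lambda: task_direct(7), number=n)
    t_layered = timeit.timeit(lambda: task_layered(7), number=n)
    print(f"direct:  {t_direct:.4f}s")
    print(f"layered: {t_layered:.4f}s")  # same answer, more time
```

Real abstraction layers buy flexibility in exchange for that overhead, which is the trade-off described above.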

As a software engineer/computer scientist, I'm quite happy with what MS is doing. For now, please give them time to improve the OS.
 

a5cent

New member
Nov 3, 2011
^ Glad I'm not the only person explaining this around here anymore. Unfortunately, none of this is intuitively understood without having at least a little bit of a background in software engineering (I'm guessing you lost most people somewhere around "abstraction layer"). 😀
 

EspHack

New member
Jun 11, 2013
@BackToTheFuture's explanation is the fundamental reason here, but to say a lack of proper QA teams and testing means nothing is taking it a bit too far. Modern Microsoft software in general is just not as reliable as it used to be; they are building faster than they can refine. On PC you have a choice: avoid "modern" apps and everything is business as usual, but step into "apps" and you are likely to see crashes and random nonsense impeding your work.

On mobile you have neither a choice nor an overpowered machine to compensate; that's why user complaints peak there.

I also remember articles about Microsoft focusing heavily on optimizing WP7/8 to get it near perfection, and every reviewer was fully aware of that while testing the end result. None of that is true today: if a certain number of "insiders" approve, then ship it.
 

CraigCole

New member
Nov 5, 2015
^ Glad I'm not the only person explaining this around here anymore. Unfortunately, none of this is intuitively understood without having at least a little bit of a background in software engineering (I'm guessing you lost most people somewhere around "abstraction layer"). 😀

Yeah, I read the first sentence about abstraction layers on Wikipedia and gave up! :confused: It's as over my head as the International Space Station.

But anyway, I guess it's safe to say that at its very foundation W10M is dramatically more complicated than WP7/8/8.1, which causes it to bog down even top-spec hardware. Speed, efficiency and elegance were sacrificed for greater expandability and a shared core with desktop Windows. The benefits of this change have yet to be seen... all I know is that my Lumia 950 is buggy, stuttery and frustratingly unreliable, exactly the opposite of what Panos Panay described when he stood on stage this time last year.
 

BackToTheFuture

New member
Aug 9, 2013
The benefits of this change have yet to be seen... all I know is that my Lumia 950 is buggy, stuttery and frustratingly unreliable, exactly the opposite of what Panos Panay described when he stood on stage this time last year.

Two benefits are apparent: faster development of the OS, and easy expansion in a "plug-in" fashion.

As for the bugs, well, they get both my sympathy and my complaints. Every big piece of software has bugs; that's a fact. But if the developers really put their minds where they should be, they will strive for software perfection. Nevertheless, the quality will improve over time - they get paid to do it, after all. I have observed the improvement after the AU on my 920 and 1520, compared to the initial release of W10M. I do not have the 950, hence can't comment on it.
 

kaktus1389

New member
Feb 7, 2016
all I know is that my Lumia 950 is buggy, stuttery and frustratingly unreliable, exactly the opposite of what Panos Panay described when he stood on stage this time last year
When does it stutter? Is it randomly or when you're doing something particular? I have 0 problems on my 950, latest firmware and latest RS1 Build.
 

techiez

Member
Nov 3, 2012
Two benefits are apparent: faster development of the OS, and easy expansion in a "plug-in" fashion.

As for the bugs, well, they get both my sympathy and my complaints. Every big piece of software has bugs; that's a fact. But if the developers really put their minds where they should be, they will strive for software perfection. Nevertheless, the quality will improve over time - they get paid to do it, after all. I have observed the improvement after the AU on my 920 and 1520, compared to the initial release of W10M. I do not have the 950, hence can't comment on it.

Indeed, software development is complex and every big piece of software has bugs, but you don't release half-baked software to production. The initial W10M production versions were horrible, which is inexcusable, and I believe they are relying on Insiders for their testing, as they have bare-minimum resources allocated to mobile.
 

BackToTheFuture

New member
Aug 9, 2013
Indeed, software development is complex and every big piece of software has bugs, but you don't release half-baked software to production. The initial W10M production versions were horrible, which is inexcusable, and I believe they are relying on Insiders for their testing, as they have bare-minimum resources allocated to mobile.

They released the software prematurely because of demand. Imagine if they had only released W10M this year; the world would have gone mad.
And no, no company relies on outsiders to test its software. MS has plenty of testers at hand. They release the previews to collect OPINIONS and SUGGESTIONS, and the occasional bug report.
 

xandros9

Active member
Nov 12, 2012
And no, no company relies on outsiders to test its software. MS has plenty of testers at hand. They release the previews to collect OPINIONS and SUGGESTIONS, and the occasional bug report.

Well, Microsoft does now. They laid off a non-trivial number of QA engineers and offloaded the work to the developers as well as the Insider program. They're saving money this way, so of course they're doing it.

Since then, Windows 10 has always been noticeably quirkier to me on every device I've seen it on. On the PCs at work I don't use it enough to tell, but the family desktop, my laptop, three Lumias, and my dad's tablet all went to Windows 10 (both at launch and with the Anniversary Update), and it's had more bugs than I've seen historically. My laptop is now back on 8.1 and the family desktop is still on 10, but at least the Start menu isn't broken anymore.

Of course, they're okay for normal use for most people, but I'm rather intolerant of unexpected events on my gadgets.

This article focuses on the recent webcam issue, but it does touch upon the QA issue I'm trying to get at. Windows 10 Anniversary Update breaks most webcams | Ars Technica
 

a5cent

New member
Nov 3, 2011
@BackToTheFuture's explanation is the fundamental reason here, but to say a lack of proper QA teams and testing means nothing is taking it a bit too far
Xandros mentioned quirks. I think that's an accepted synonym for bugs and other imperfections, in which case you and Xandros are right. A strongly downsized QA team will impact such things negatively.

However, that's not what the OP was asking about. His query was in regard to "software optimizations" and "performance oddities". In that regard BackToTheFuture is absolutely correct. The size of the QA team is pretty much irrelevant. In that regard the QA team means practically nothing.

Bugs/quirks and performance characteristics are two different and (for the most part) unrelated aspects of a software system.

Anyway, to go a bit off topic... I'm not sure a downsized QA team can really be blamed for the quirks. I know people in the QA team were let go, but how many? AFAIK the overwhelming majority of layoffs targeted ex-Nokia staff, and Nokia was never involved in QA for the OS. If someone has more information to share on that, I'd certainly be interested. Like BackToTheFuture, I'm not willing to believe that MS has no QA staff or that MS relies primarily on Insiders for testing. The chances of that being true are pretty much zero.

Consider also that a lot of the "quirks" were so darn obvious that it's impossible MS wasn't aware of them. MS would have been aware of them even if they had no QA team whatsoever. In my experience, it's far more likely that MS decided to adhere to a very tight shipping schedule. Sometimes, shipping what you have now is more important than perfecting it first. That would also explain why W10M was the first update that didn't just arrive and install itself automatically. No. MS knew perfectly well that W10M was bug ridden and quirky. They just decided to ship regardless.
 

CraigCole

New member
Nov 5, 2015
Since then, Windows 10 has always been noticeably quirkier to me on every device I've seen it on. On the PCs at work I don't use it enough to tell, but the family desktop, my laptop, three Lumias, and my dad's tablet all went to Windows 10 (both at launch and with the Anniversary Update), and it's had more bugs than I've seen historically. My laptop is now back on 8.1 and the family desktop is still on 10, but at least the Start menu isn't broken anymore.

You're spot on here. My Surface Pro 4 has all kinds of little bugs; you never really know if it's going to work properly when you need it to. Even today, with all the latest updates, there are myriad little annoyances. For instance: Cortana only responds to reminder queries properly about 50 percent of the time; I took a web note the other day in Edge, and when I viewed it in OneNote none of my scribbles lined up where they were on the page; the Type Cover kicks back some weird USB-connection error about once a week (aside from randomly disconnecting and reconnecting in rapid succession about twice a day); universal Windows apps are annoyingly unstable; and, hell, there's still a problem with taskbar highlights getting stuck, an unaddressed issue since Windows 7. But you get the idea. Back to W10M...

Consider also that a lot of the "quirks" were so darn obvious that it's impossible MS wasn't aware of them. MS would have been aware of them even if they had no QA team whatsoever. In my experience, it's far more likely that MS decided to adhere to an unrealistic shipping schedule. Sometimes, shipping what you have now is more important than perfecting it first. That would also explain why W10M was the first update that didn't just arrive and install itself automatically. No. MS knew perfectly well that W10M was bug ridden and quirky. They just decided to ship regardless.

And you know something (of course, lots :winktongue:)? I almost wish they hadn't shipped W10M until now. Last autumn I would have just replaced my 1020, which didn't survive a drop very well, with a 1520 or similar Lumia. Instead, I opted for the then brand-new 950, which I perhaps foolishly expected to be every bit as reliable, smooth and responsive as all the other Windows Phones I'd had. And with a six-core processor, three gigs of RAM and a 3,000 mAh battery on board, why shouldn't it perform with the same speedy slickness? This performance disparity is what prompted my initial question in this thread.

Knowing what I know now, I'd have gladly waited for a real flagship-quality device to ship, one with suitably stable software and a much more attractive design. As it stands, it's doubtful I'll get another Microsoft/W10M handset in the future; that's how displeased I am with the overall experience of owning a 950. But as always, hindsight is 20/20; the company obviously determined it was better to ship less-than-robust software at this point last year instead of waiting for it to mature a little more.
 

BackToTheFuture

New member
Aug 9, 2013
Well, Microsoft does now. They laid off a non-trivial number of QA engineers and offloaded the work to the developers as well as the Insider program. They're saving money this way, so of course they're doing it.

This article focuses on the recent webcam issue, but it does touch upon the QA issue I'm trying to get at. Windows 10 Anniversary Update breaks most webcams | Ars Technica

Software testing is an inseparable part of software development. MS has been a software powerhouse for more than 30 years; they know this better than almost anyone here. Saying they're slacking off on testing is nonsensical, not to mention Windows is their most important product (though not their biggest profit generator). If anything, they are forced to ship on schedule even when some bugs remain that don't affect normal operation (in official releases, of course, not counting preview builds).

The webcam problem is not a bug. MS changed the video-capture driver model, and HD webcams that transmit compressed data had their drivers broken - NOT every webcam. The manufacturers need to update their drivers to adapt to the new model.

@a5cent has repeated it twice already, but I will say it once more: testing has nothing to do with software optimization or efficiency. And give them time to improve the OS. I will tell you one little truth: there are normally only one or two programmers working on one module (Windows has thousands of them), plus one or two testers. More people do not speed up the process.
 

CraigCole

New member
Nov 5, 2015
@a5cent has repeated it twice already, but I will say it once more: testing has nothing to do with software optimization or efficiency. And give them time to improve the OS. I will tell you one little truth: there are normally only one or two programmers working on one module (Windows has thousands of them), plus one or two testers. More people do not speed up the process.

So, here are a couple of questions for you and a5cent. In practical terms, when software is optimized, what does that mean? How or what is changed? Is the code just refined and made more efficient? These are probably idiotic questions but I figured I'd ask anyway :amaze:!
 

BackToTheFuture

New member
Aug 9, 2013
So, here are a couple of questions for you and a5cent. In practical terms, when software is optimized, what does that mean? How or what is changed? Is the code just refined and made more efficient? These are probably idiotic questions but I figured I'd ask anyway :amaze:!

Algorithmically, the software is optimized to be more efficient: either to use less memory or to run faster, or both. The code (or algorithm) must be changed (redesigned, modified, rewritten) in order to make it more efficient. There's optimization within a single module, and optimization across modules.

Hope it helps!
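A tiny toy example of what "redesigning the algorithm" means in practice (invented for illustration, nothing to do with Windows internals): both functions below give the same answer, but the second avoids rescanning the list for every item.

```python
from collections import Counter

# Count how many items in a list are duplicates (appear more than once).

def count_dupes_slow(items):
    # list.count() rescans the whole list for each item: O(n^2) overall.
    return sum(1 for x in items if items.count(x) > 1)

def count_dupes_fast(items):
    # One pass to tally the counts, one pass to test them: O(n) overall.
    counts = Counter(items)
    return sum(1 for x in items if counts[x] > 1)

assert count_dupes_slow([1, 2, 2, 3, 3, 3]) == 5
assert count_dupes_fast([1, 2, 2, 3, 3, 3]) == 5
```

Same result, different running time: on a list of a million items the first version would do roughly a trillion comparisons, the second only a couple of million.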
 

a5cent

New member
Nov 3, 2011
So, here are a couple of questions for you and a5cent. In practical terms, when software is optimized, what does that mean? How or what is changed? Is the code just refined and made more efficient? These are probably idiotic questions but I figured I'd ask anyway :amaze:!

In a nutshell, software optimization can and will mean very different things depending on what you want to optimize. For example, if you want to optimize memory consumption, you'd work on very different aspects of a software system than, say, if you were attempting to tailor an algorithm to the capabilities of a specific graphics card. Both of those tasks are again completely different from the challenges a software team faces when trying to optimize battery life. Of course, we also have the classical problem of optimizing performance, which means we aim to dramatically reduce the number of CPU instructions that must be executed to reach a certain state or compute a certain result. The reduction must be dramatic, because shaving a thousand CPU instructions off a task saves well under a microsecond, which humans just aren't equipped to notice.

To make things manageable, let's ignore everything except performance optimization:

Consider for a moment a graphics card. If you have a display with a resolution of 2560x1440 refreshed at 60 Hz, your graphics card is pushing over 221 million pixels to your monitor per second. That is why developers who work on graphics engines usually take great care to optimize raster operations, as any performance loss or gain is multiplied by millions. It's in places like these that developers invest a lot of effort into optimizing individual algorithms and individual lines of code. This isn't a very typical software engineering environment, however.
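That throughput arithmetic is easy to check:

```python
# Pixels pushed per second for a 2560x1440 display refreshed at 60 Hz.
pixels_per_frame = 2560 * 1440          # 3,686,400 pixels per frame
pixels_per_second = pixels_per_frame * 60
print(pixels_per_second)                # 221184000 -> over 221 million
```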

In most software systems, it's more typical to achieve performance gains by modifying the software design. Consider an online store that applies rebates to the total cost of items based on the quantity ordered. Say you changed the quantity of an item in your shopping cart. The web page you downloaded to your browser might relay that change back to the server, which would do the cost calculations and return the results to your browser, where they would be displayed. If that online store is on the other side of the world, that might take a second or two. The web developer might then decide to include a numerical description of the store's rebate policy as part of the web page, along with some JavaScript that can do the entire cost calculation in the browser. By doing so, the developer "optimized away" the round trip between web browser and server (for that one particular task). Overall, a very similar amount of computational work is being done, yet the user still gets the results quicker. This is the most trivial example I could think of. Its purpose is to demonstrate what it means to identify and remove hardware bottlenecks. In this example the bottleneck was the network, but pretty much every piece of hardware that is part of (or connected to) your computer can become a bottleneck.
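Here's that shopping-cart idea as a Python sketch (the rebate tiers, prices and function names are all invented for illustration; a real store would ship the logic to the browser as JavaScript). Both totals come from identical arithmetic; only the round trip differs.

```python
import time

# Hypothetical rebate policy: 10% off for 10+ units, 5% off for 5+.
REBATE_TIERS = [(10, 0.10), (5, 0.05)]
UNIT_PRICE = 4.00

def rebate_for(quantity):
    for threshold, rate in REBATE_TIERS:
        if quantity >= threshold:
            return rate
    return 0.0

def total_via_server(quantity):
    # Original design: every quantity change pays a network round trip.
    time.sleep(0.5)  # stand-in for latency to a far-away server
    return quantity * UNIT_PRICE * (1 - rebate_for(quantity))

def total_in_page(quantity):
    # Optimized design: the same arithmetic runs locally, so the round
    # trip is "optimized away" even though the work done is identical.
    return quantity * UNIT_PRICE * (1 - rebate_for(quantity))

assert abs(total_in_page(10) - total_via_server(10)) < 1e-9
```

The user-visible speedup comes entirely from removing the bottleneck (the network), not from doing less computation.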

With that I've described two fundamentally different ways a developer might improve performance in a software system: one involves optimizing low-level algorithms, the other the removal of bottlenecks. I could list a dozen more (e.g. lazy initialization, caching, etc.) and we'd still not have touched on memory consumption, battery life, or anything else, but I'll spare us both the trouble of reading and writing all that :)
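Of those extras, caching is the easiest to demonstrate. A toy Python sketch (using the standard library's functools.lru_cache as a stand-in for a hand-rolled cache):

```python
from functools import lru_cache

calls = {"count": 0}  # tracks how often the slow path actually runs

@lru_cache(maxsize=None)
def expensive_lookup(key):
    # Pretend this hits a disk or a network service. With the cache in
    # front of it, repeated lookups of the same key skip the slow work.
    calls["count"] += 1
    return key.upper()

expensive_lookup("lumia")
expensive_lookup("lumia")
expensive_lookup("lumia")
assert calls["count"] == 1  # three lookups, but the slow path ran once
```

Caching trades memory for time, which is why it's a performance optimization and not a memory one; the two goals often pull in opposite directions.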

P.S.
I know we throw the word "optimization" around quite casually in these parts, but I hope I've shown that yours is most definitely not an idiotic question. You might also be interested in what Wikipedia has to say about it.
 

CraigCole

New member
Nov 5, 2015
Algorithmically, the software is optimized to be more efficient [...]

In a nutshell, software optimization can and will mean very different things depending on what you want to optimize. [...]

Thanks for the clarification, guys! I'll never be a computer scientist or software engineer, but these explanations make a lot of sense, even to a simpleton like me. :smile:
 
