What's up with the 1 GB RAM requirement ?


OzRob

New member
Mar 20, 2013
When Google says it will kill EAS, then says it will kill CalDAV, and then later that it will kill EAS but not now, or that it will kill CalDAV but you can whitelist it, it makes sense to move forward and not accept being blackmailed or having everything shut off on a wimp by Google.

Ummm...you do know that calendar syncing was shut off a few days ago on a whim (or even a wimp) by Microsoft, don't you? And they didn't even tell anyone that updating the core apps would have this effect. Like I said, Google-Microsoft, Microsoft-Google - they are both as bad as each other at the moment.
 

ChMar

New member
Mar 15, 2013
Ummm...you do know that calendar syncing was shut off a few days ago on a whim (or even a wimp) by Microsoft, don't you? And they didn't even tell anyone that updating the core apps would have this effect. Like I said, Google-Microsoft, Microsoft-Google - they are both as bad as each other at the moment.

I will not accept being silenced with a stick, and no one should, so I can understand where MS is coming from. You can of course use any Google service you want from the browser. This was not started by MS. I didn't say that Google and Microsoft are not both corporations, or that one is better than the other. It's just that I understand their position.
 

Mirtas

New member
Mar 14, 2013
I believe this fall, with "Blue" (WP9), we'll get new hardware specs: "Above the Android Galaxy S4!"

- Quad-Core Snapdragon 800 series (MSM8974) Adreno 330
- 2GB RAM
- 5" 1080p (441PPI) displays
- 3000mAh battery
- Nokia Lumia 940
- Samsung ATIV S2

1GB RAM (MSM8960) WP8 devices will become the low-end Windows Phones this fall.

I think they are going to choose the Snapdragon 600, because the 800 is mostly for tablets. If LG can put a 2460 mAh battery in a 4.3 inch screen device (P710) that is smaller and a tiny bit thinner than the Lumia 820 (1650 mAh battery), surely we can expect the next Lumia 9x0 to have 2200+ mAh. But it is not always about mAh: the iPhone 5 only has a 1440 mAh battery and does a great job with it.

Based on the past, the problem is that Windows Phone devices are a few steps behind in hardware and power efficiency. By the time a 2GB quad-core Windows Phone hits the market (I expect November 2013) it will be falling behind again. It should not be about having the latest hardware but about getting the best performance out of the hardware you have, yet that argument does not hold up when you can't play the latest games, nor in marketing terms. The average consumer sees 8 cores as better than 2 cores, 2GB of RAM as better than 1GB, and a 13MP camera as better than 8MP.

Windows Phone needs time, and I am positive that within 18 months they will catch up and we will see 15% market share. That will force developers to be more efficient with their programming and will also force manufacturers to provide the best hardware. And there will be more competition within the WP world: developers will compete for your money and there will be more manufacturers. I am positive that once there is money to be made in the WP world, brands like LG and Sony will join.
 

a5cent

New member
Nov 3, 2011
Microsoft has (unfortunately) never chosen the top-of-the-line chipset for Windows Phone.
  • 2010: chose S1 over S2
  • 2011: chose S2 over S3
  • 2012: chose S4 over S4 Pro
  • 2013: will likely choose 600 over 800

Qualcomm introduced the segmentation into S1, S2, S3 and S4 lines after the fact, and in a very arbitrary fashion. While your list isn't wrong, it doesn't really explain the situation:

From 2010 to late 2012 (WP7):
All devices were built using the exact same CPU. Clock rates and the GPU were the only differences between SoCs certified for WP7.

Qualcomm just decided to screw around with labels... if an S1 was clocked above 1GHz they called it an S2... nothing but marketing...

Neither a "real" S2 nor an S3 ever made it onto a WP7 device.

From late 2012 to now (WP8):
All devices were built using the S4, in both Plus AND Pro versions. The MSM8960T in the Chinese L920 is an S4 Pro! And again... clock rates and the GPU are the only differences between the SoCs certified for WP8.

The point is, it was never about purposely choosing "only" the second most powerful SoC... for whatever reason. It was, and still is, about limiting hardware variability with the goal of limiting fragmentation.
 

a5cent

New member
Nov 3, 2011
Yes, chipsets must be certified. But this only means OEMs put in a request for a chipset to go through certification.

While I completely agree with all of your reasoning, AFAIK this first part is incorrect.

MS employees have told me, more than once, that MS makes these evaluations on their own. I did not specifically ask them if OEMs can file requests, but I'm sure that would have been mentioned if that was part of the process.
 

ChMar

New member
Mar 15, 2013
Qualcomm introduced the segmentation into S1, S2, S3 and S4 lines after the fact, and in a very arbitrary fashion. While your list isn't wrong, it doesn't really explain the situation:

From 2010 to late 2012 (WP7):
All devices were built using the exact same CPU. Clock rates and the GPU were the only differences between SoCs certified for WP7.

Qualcomm just decided to screw around with labels... if an S1 was clocked above 1GHz they called it an S2... nothing but marketing...

Neither a "real" S2 nor an S3 ever made it onto a WP7 device.

From late 2012 to now (WP8):
All devices were built using the S4, in both Plus AND Pro versions. The MSM8960T in the Chinese L920 is an S4 Pro! And again... clock rates and the GPU are the only differences between the SoCs certified for WP8.

The point is, it was never about purposely choosing "only" the second most powerful SoC... for whatever reason. It was, and still is, about limiting hardware variability with the goal of limiting fragmentation.

I don't believe there will be much fragmentation if you allow for a greater range of chipsets. For apps this is not a problem, and as for games, when a game supports both the 512MB and 1GB versions of the platform, whatever chipset you throw in there will be supported.
 

a5cent

New member
Nov 3, 2011
6,622
0
0
Visit site
I don't believe there will be much fragmentation if you allow for a greater range of chipsets. For apps this is not a problem, and as for games, when a game supports both the 512MB and 1GB versions of the platform, whatever chipset you throw in there will be supported.

On this we will have to agree to disagree. A SoC is not just a pairing of a CPU with a GPU. An OS can hide the differences in hardware only to a point. When it comes to the utilization of media processors, DSPs and any of the other co-processors on a SoC, we are far past that point. Then the differences become noticeable. If not to the dev, then definitely to the consumer. That is fragmentation.
 

ChMar

New member
Mar 15, 2013
273
0
0
Visit site
On this we will have to agree to disagree. A SoC is not just a pairing of a CPU with a GPU. An OS can hide the differences in hardware only to a point. When it comes to the utilization of media processors, DSPs and any of the other co-processors on a SoC, we are far past that point. Then the differences become noticeable. If not to the dev, then definitely to the consumer. That is fragmentation.

You are right about the other components in the chipset. But for the differences to become noticeable, something must be missing, like Bluetooth for example, and that would not pass certification because of the minimum feature requirements. I don't believe you would be able to notice a different DSP or media processor as an end user. There is no specification for how high the resolution of a movie must be, or the frame rate for video recording. I see no case of fragmentation as far as user experience is concerned, meaning speed and fluidity of the OS (considering the minimum features imposed by MS). I don't know if there are encoding requirements from MS, so I don't think it's a case of fragmentation here.
 

ChMar

New member
Mar 15, 2013
273
0
0
Visit site
Weekend's over. Did you find any of these 'iPhone 5 exclusive' apps? Because being an owner of an iPhone 5 myself, and an iPhone 4S last year, I haven't found any such app.

:)))) Sure, the weekend is over. I didn't have time. But if you can't wait for me, feel free to ask me anytime you wish. Rest assured I have not forgotten.
 

Bicpug

New member
Oct 19, 2012
I can't see any reason for 1080p on a phone, other than specification bragging; it's just more pixels to push around in games for very little visual benefit.
 

ChMar

New member
Mar 15, 2013
I can't see any reason for 1080p on a phone, other than specification bragging; it's just more pixels to push around in games for very little visual benefit.

More PPI. Crisper-looking text. And Metro is all about text. So for larger displays it makes sense.
 

nessinhaw

New member
Mar 16, 2013
More PPI. Crisper-looking text. And Metro is all about text. So for larger displays it makes sense.

after some research i reached the conclusion that 450ppi is the max for a perfect human eye... very few people have this kind of "bionic" eye, in fact it's rare... so for the majority of people, at around 350ppi they can't discern pixels anymore (Wikipedia says 300ppi is the limit)

based on that, all these 350+ppi screens aren't necessarily any crisper, simply because our eyes can't see the pixels anymore, so we can't really tell the difference .-.

i think the industry makes screens with ppi far above our human limitation as a marketing/selling point - for the average consumer, bigger numbers mean better - this type of consumer doesn't know about the human eye limitation, so they immediately assume 400+ppi is something better when in fact from 350ppi and above they can't differentiate pixels anymore!

i believe the difference will lie in color/brightness/contrast/viewing angles/sunlight legibility, since for the majority of the population 350ppi already looks crisp... unless you're one of the few with a "bionic" eye!

edit: i did some more research and found that the max ppi for a perfect human eye is higher - i even found figures as high as 570ppi - but since that applies to a rare part of the population, i took 350ppi into account!

edit2: please don't bash me if i said something wrong in this post D: i'm not an expert, that's just what i concluded after lots of reading... it would be nice for someone who is an expert on this to give their opinion!
 

Sanjay Chandra

New member
Mar 2, 2013
after some research i reached the conclusion that 450ppi is the max for a perfect human eye... very few people have this kind of "bionic" eye, in fact it's rare... so for the majority of people, at around 350ppi they can't discern pixels anymore (Wikipedia says 300ppi is the limit)

based on that, all these 350+ppi screens aren't necessarily any crisper, simply because our eyes can't see the pixels anymore, so we can't really tell the difference .-.

i think the industry makes screens with ppi far above our human limitation as a marketing/selling point - for the average consumer, bigger numbers mean better - this type of consumer doesn't know about the human eye limitation, so they immediately assume 400+ppi is something better when in fact from 350ppi and above they can't differentiate pixels anymore!

i believe the difference will lie in color/brightness/contrast/viewing angles/sunlight legibility, since for the majority of the population 350ppi already looks crisp... unless you're one of the few with a "bionic" eye!

edit: i did some more research and found that the max ppi for a perfect human eye is higher - i even found figures as high as 570ppi - but since that applies to a rare part of the population, i took 350ppi into account!

edit2: please don't bash me if i said something wrong in this post D: i'm not an expert, that's just what i concluded after lots of reading... it would be nice for someone who is an expert on this to give their opinion!

PPI needs viewing distance as well.

300 ppi stands for a 10-12 inch distance i think (not sure), according to wiki.

We use laptops at a farther distance than tablets, and we use a tablet at a farther distance than a phone (assuming average distances).

So, pixel differentiation depends on viewing distance, eyesight, PPI and software scaling capability.

For example, the MacBook Pro with Retina Display has only 220ppi (or near that number as far as i know), but OS X scales things and renders them better than iOS/Android, therefore even with lower PPI, John Gruber stated that he found the 220 ppi Retina Display of the MacBook Pro to be as appealing as the 326/264 ppi of iOS devices.
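
For anyone who wants to put rough numbers on the distance argument above: assuming the commonly cited 1 arcminute resolution for 20/20 vision (my assumption, not a figure from the posts), a quick Python back-of-the-envelope gives the PPI beyond which pixels stop being distinguishable at a given viewing distance.

Code:
import math

def max_useful_ppi(distance_in, acuity_arcmin=1.0):
    # Smallest feature a viewer with the given acuity can resolve
    # at this distance (in inches), inverted to get pixels per inch.
    theta = math.radians(acuity_arcmin / 60.0)
    pixel_size = 2 * distance_in * math.tan(theta / 2)
    return 1.0 / pixel_size

for d in (8, 10, 12, 18):
    print(f"{d:>2} in -> ~{max_useful_ppi(d):.0f} PPI")
# 8 in -> ~430, 10 in -> ~344, 12 in -> ~286, 18 in -> ~191

Those figures line up with the numbers quoted in this thread: roughly 300 PPI at a 10-12 inch phone distance, with 400+ PPI only paying off if the phone is held closer than about 8 inches.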
 

a5cent

New member
Nov 3, 2011
I don't believe you would be able to notice a different DSP or media processor as an end user.

There are four such co-processors on every S4 SoC. What you are basically saying is that Qualcomm is wasting their time designing and integrating them, and that MS is wasting their time utilizing them or exposing specific S4 features to OEMs. You are saying this is all for naught, because users won't notice a difference either way.

I disagree.

I'm going to skip the lengthy explanations detailing why, mainly because I think you'll understand without them. ;-)

BTW... Android does the exact opposite, more along the lines you've suggested. The general-purpose computing cores feature much more prominently, largely in the hope of artificially making the hardware platforms more similar than they really are. This simplifies porting between SoCs, but makes the OS much less efficient as a whole.
 

rbxtreme

New member
Jan 17, 2013
:)))) Sure, the weekend is over. I didn't have time. But if you can't wait for me, feel free to ask me anytime you wish. Rest assured I have not forgotten.

We with our WP8 devices are so, so tied up with the exciting apps that have been pushed onto the Store the whole of last week that, forget borrowing, we don't even have the time to think, "what phone was that, again?" 😊 On a lighter note, folks.
 

nessinhaw

New member
Mar 16, 2013
PPI needs viewing distance as well.

300 ppi stands for a 10-12 inch distance i think (not sure), according to wiki.

We use laptops at a farther distance than tablets, and we use a tablet at a farther distance than a phone (assuming average distances).

So, pixel differentiation depends on viewing distance, eyesight, PPI and software scaling capability.

For example, the MacBook Pro with Retina Display has only 220ppi (or near that number as far as i know), but OS X scales things and renders them better than iOS/Android, therefore even with lower PPI, John Gruber stated that he found the 220 ppi Retina Display of the MacBook Pro to be as appealing as the 326/264 ppi of iOS devices.

yes, i didn't even want to get into distances because i was tired and my brain almost went boom from all the scientific reading >.< so i just wanted to keep it simple...

in my understanding, higher ppi will only make a difference if we hold our phones closer than 8 inches... but for the regular distance we use (10-13 inches) i think more than 400ppi is not needed! the industry should focus more on improving colors, brightness, contrast, saturation... after all, ppi is not the only thing that matters in a screen
 

Mirtas

New member
Mar 14, 2013
PPI needs viewing distance as well.

300 ppi stands for a 10-12 inch distance i think (not sure), according to wiki.

We use laptops at a farther distance than tablets, and we use a tablet at a farther distance than a phone (assuming average distances).

So, pixel differentiation depends on viewing distance, eyesight, PPI and software scaling capability.

For example, the MacBook Pro with Retina Display has only 220ppi (or near that number as far as i know), but OS X scales things and renders them better than iOS/Android, therefore even with lower PPI, John Gruber stated that he found the 220 ppi Retina Display of the MacBook Pro to be as appealing as the 326/264 ppi of iOS devices.

A 40 inch 1080p TV only has 55 PPI and yet it looks great. It is all about distance.

Personally I see a big difference between a 4.3 inch screen at 800x480 (217 PPI) and a 4.5 inch screen at 1280x768 (332 PPI), but I see very little difference between that and 1920x1080 on a 5 inch screen (441 PPI), unless I put it very close to my eyes, which I would never do in real life.

I think a 4.3 inch screen with 720p is a perfect match: put it in a 125x65x9.5mm phone with a quad-core 1.2 GHz CPU, 1GB of RAM, 8GB internal storage + up to 64GB SD, a PureView camera, 1080p video on the back, 480p video on the front, and a 2200 mAh battery at max 140g, and it is my dream phone.

But we are getting a bit off-topic.
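
The PPI figures quoted above are easy to verify: PPI is simply the diagonal pixel count divided by the diagonal size in inches. A small Python sketch:

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels along the diagonal divided by the diagonal length in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 40)))   # ~55  (40" 1080p TV)
print(round(ppi(800, 480, 4.3)))    # ~217 (4.3" WVGA)
print(round(ppi(1280, 768, 4.5)))   # ~332 (4.5" WXGA)
print(round(ppi(1920, 1080, 5.0)))  # ~441 (5" 1080p)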
 

ChMar

New member
Mar 15, 2013
There are four such co-processors on every S4 SoC. What you are basically saying is that Qualcomm is wasting their time designing and integrating them, and that MS is wasting their time utilizing them or exposing specific S4 features to OEMs. You are saying this is all for naught, because users won't notice a difference either way.

I disagree.

I'm going to skip the lengthy explanations detailing why, mainly because I think you'll understand without them. ;-)

BTW... Android does the exact opposite, more along the lines you've suggested. The general-purpose computing cores feature much more prominently, largely in the hope of artificially making the hardware platforms more similar than they really are. This simplifies porting between SoCs, but makes the OS much less efficient as a whole.

Definitely, hardware-accelerating stuff is not a waste of time. But these days it is not the software that demands those power increases. So I don't see how an app can take advantage of such features (not even games, actually), not that apps would ever have direct access to them (I mean they still need to go through the OS for those). Even hardware-accelerated media encoding will not get used to its full potential. It makes no sense to limit my app to the few supporting SoCs rather than have a server do that work and support all devices (not only WP devices but iOS and Android too).

We can take all the apps in the store and see how many of them do media encoding or anything more than accessing web services. And that is the reason for a broader choice of chipsets: current software is targeted at casual consumers and does not truly need all that power. You won't even run math-intensive stuff on the CPU. That kind of math you will run only on photos, and you will use a shader for that and take advantage of GPU acceleration. The same can obviously be used for sound too (as a side note, I can hardly wait for DirectCompute to come with the newer GPUs in the phone landscape).

I love the way you see the future, I really do, and I wish it to be true. But politics and economics do not let it be that way.
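
To illustrate the kind of per-pixel photo math described above (my own illustration, not code from the thread): each output pixel depends only on its corresponding input pixel, which is exactly the shape of work a pixel shader parallelises on the GPU. A small numpy sketch of a sepia transform shows the arithmetic:

Code:
import numpy as np

def sepia(rgb):
    # Classic sepia colour matrix; the same 3x3 multiply per pixel is
    # what a GPU pixel shader would run, one thread per pixel.
    m = np.array([[0.393, 0.769, 0.189],
                  [0.349, 0.686, 0.168],
                  [0.272, 0.534, 0.131]], dtype=np.float32)
    out = rgb.astype(np.float32) @ m.T
    return np.clip(out, 0, 255).astype(np.uint8)

# A 1080p frame of random data, just for the demo
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
print(sepia(frame).shape)  # (1080, 1920, 3)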
 

nessinhaw

New member
Mar 16, 2013
I kinda laugh at how such powerful SoCs are needed for calls, messaging, FB, Instagram, Skype... that's basically what most people use their phone for, and why i don't invest money in a high-end (in Brazil they're even more expensive because of taxes)... i'm pretty satisfied with my Lumia 620, it's not the most powerful phone but it does all the basic stuff perfectly - all the common stuff we do with our phones! (it was still a bit expensive tho, damn taxes ;-;)

Many people will buy the powerful SGS IV or HTC One for FB/Instagram (as an example), what a waste imo lol

But would this need for more powerful hardware be because Android is demanding more? So far, iOS/WP have been just fine with dual-core processors... it leads me to believe Android is a needy b1tch lol
 