Two recent reviews of the Lumia 1020 come to sharply different conclusions about the same feature:
Consumer Reports On-Line:
"Video performance was not that impressive, but it's adequate for casual use such as uploading to the Web."
Digital Photography Review Connect:
"The 1020 captures nicely exposed, fluid video in good light. Focus is solid. The image stabilized lens really helps keep the image steady. In low light, the 1020's big sensor really shines, delivering remarkably ungrainy video with a lot of detail. Focus drift is minimal – if it became a problem, you could lock it manually."
Hmmm. Consumer Reports, which acknowledges that most of its subscribers use Apple products and which ranks devices by subscriber surveys rather than test results. Or DPR, which actually tests cameras day in and day out. I know bias can taint any review, but I suspect I also know which of these two organizations actually spent time getting to know the 1020 before writing a review and which just used it for a while.
There is a second possibility: perhaps Consumer Reports simply got a 1020 with bad video components. You'd think they might at least check on that, particularly when other non-partisan reviewers report the exact opposite findings. But with that subscriber survey in mind, perhaps they don't want to go too far in praising something their readers don't buy.