A U.K. woman was photographed standing in front of a mirror in which her reflections didn’t match, but not because of a glitch in the Matrix. Instead, it’s a simple iPhone computational photography mistake.

  • Nine@lemmy.world · 1 year ago
    So what was I wrong about? I’m always happy to learn from my mistakes! 😊

    Do you have some whitepapers I can reference too?

      • Nine@lemmy.world · 1 year ago
        Gonna provide more information, or is this just a “trust me bro” situation?

        • SpaceNoodle@lemmy.world · 1 year ago (edited)
          Not sure what I’d have to gain from just lying on the Internet about inconsequential things.

          Also not sure I can disclose too many technical details due to NDAs, but I’ve worked on the camera stacks of multiple Android-based devices. Yes, there are tons of layers of firmware and software throughout the camera stack, but, importantly, the stack does not alter consequential elements of an image; it concentrates on image quality, not image contents.

          While smartphone sensors might not be as physically large as those in DSLRs (at least in general), there’s still enough fidelity in the raw sensor data that producing a good image does not inherently require the sort of image stitching that Apple is doing.
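          To make that concrete, here’s a minimal sketch (Python with NumPy, and emphatically not Apple’s actual pipeline) of the kind of multi-frame “stitching” in question: buffer a burst of frames and, for each tile of the output, keep the frame a quality metric scores highest. The tile size and the variance-based sharpness score are made-up stand-ins, but the sketch shows how a pipeline aimed purely at image quality can still freeze a moving subject in a different pose in each region of the final photo:

            import numpy as np

            def fuse_frames(frames, tile=64):
                """Fuse a burst of grayscale frames: for each tile, keep the
                frame whose patch scores highest on a crude sharpness metric
                (variance here, a stand-in for a real quality measure)."""
                stack = np.stack(frames)                   # (n_frames, H, W)
                out = np.empty_like(frames[0])
                h, w = out.shape
                for y in range(0, h, tile):
                    for x in range(0, w, tile):
                        patches = stack[:, y:y + tile, x:x + tile]
                        scores = patches.reshape(len(frames), -1).var(axis=1)
                        out[y:y + tile, x:x + tile] = patches[scores.argmax()]
                return out

            # Three frames captured instants apart; the "subject" (a bright
            # square) moves between captures, like a person posing in a mirror.
            rng = np.random.default_rng(0)
            frames = []
            for t in range(3):
                f = rng.normal(0.5, 0.05, (256, 256)).astype(np.float32)
                f[100:140, 40 + 60 * t:80 + 60 * t] = 1.0  # subject at time t
                frames.append(f)

            fused = fuse_frames(frames)
            # Each tile of `fused` may come from a different instant, so the
            # same subject can appear in different poses in different regions.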