Disclosure: HP is a client of the author.

Each summer, I get the chance to review a lot of laptops. I’m currently working on three such reviews, and one feature stood out as particularly useful for remote employees. With its latest Dragonfly G4 laptop, HP has come up with a unique solution to two problems that often plague remote work: maintaining eye contact and sharing physical items.

And I think it could be expanded into something truly amazing.

I’ve been covering videoconferencing technology since the late 1980s. The big issue: how can a remote employee truly engage when they don’t have access to many of the tools or the proximity generally available in conference rooms?

When you’re talking to someone in person, you naturally look them in the eyes. But when you’re working remotely, you might well be looking anywhere but at the camera, which makes it seem as if you aren’t paying attention. Several companies have already come up with digital ways to alter how your eyes appear, and as generative AI develops, we could eventually create avatars of ourselves that look attentive and engaged on camera, regardless of what we’re actually doing.

But what if you have multiple screens? In that case your head may be turned away from the camera, breaking eye contact and making it more difficult for the PC to accurately capture facial movements for that avatar.

The other problem is how to share something that isn’t digital. You could buy an overhead scanner, a dedicated camera that captures paper or small objects on your desk. But they aren’t cheap, they take up a lot of room, and getting a videoconferencing system to see both the overhead scanner and your face can be an exercise in futility, particularly if the system you’re using is unfamiliar.

To address these problems, HP’s Dragonfly G4 provides a unique pair of features called “auto-camera control” and “auto-keystone.” The first lets you use two cameras (one can be the 5MP camera built into the laptop, or you can use two external cameras), placing one on each of your two monitors. When you turn from one monitor to the other, the system detects the movement and switches between the cameras, much as a TV director cuts between cameras trained on news anchors.

If you have something you want to share, you can move one of the cameras over what you are sharing and have an image of it — and your own face — appear in the same frame. You do need to select “HP Enhanced Camera” in the videoconferencing system, but this tool will work the same with all of them — so you don’t have to hunt down any specific settings (if they even exist) to get everything to work.

If you’re capturing an object, depending on the camera angle, you might have a "keystoning" problem, where the parts of the object closer to the camera appear larger and the overall shape is distorted. The “auto-keystone” feature adjusts for this so the image isn’t distorted. (If you’re interacting with the object, the system might get confused, so you can also manually set and lock in the keystone correction.)
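HP hasn’t published how auto-keystone works, but keystone correction of this kind is typically a perspective (homography) warp: map the four corners of the distorted quadrilateral the camera sees back to a true rectangle. As an illustration only, here is a minimal sketch of that underlying math in Python with NumPy; the corner coordinates are hypothetical, stand-ins for what a real system would detect automatically.

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 perspective transform mapping 4 src points to 4 dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Standard direct linear transform rows, with the h33 entry fixed at 1.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply the homography to one point (homogeneous coordinates)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# A sheet of paper seen at an angle: the far edge looks narrower (keystoning).
distorted = [(20, 0), (80, 0), (100, 100), (0, 100)]   # trapezoid as captured
rectified = [(0, 0), (100, 0), (100, 100), (0, 100)]   # true rectangle

H = homography(distorted, rectified)
for d, r in zip(distorted, rectified):
    u, v = warp_point(H, d)
    assert abs(u - r[0]) < 1e-6 and abs(v - r[1]) < 1e-6
```

In a real product, the same transform would be applied to every pixel of the video frame (image libraries such as OpenCV provide this directly), and the "lock" option HP describes would simply freeze the corner estimates instead of re-detecting them each frame.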

The problem with video systems is that there is no common interface, so doing things such as sharing something physical while continuing to show your face can be annoying if you have to dig through various menus. But if you can alter how the laptop or PC camera works and select that camera, then these tools become effectively tied to the hardware, not the videoconferencing software; their use remains constant, regardless of the system you use. 

This opens the door for PC makers to address a problem that’s been around as long as videoconferencing itself: the lack of an interface standard. If the PC took more control of the various videoconferencing systems through their APIs, it could provide a consistent video experience regardless of vendor. That would tend to create more loyal customers, and a far better experience for the rest of us.

I think HP is onto something here. This Auto Camera Selection tool could finally solve one of videoconferencing’s most annoying problems: the lack of a consistent cross-vendor interface.

IT World