The risks from corporate use of activity trackers and other wearables are low, some experts say, especially in comparison to all the other security and privacy risks CISOs, CIOs and IT folks must worry about.

That said, as with any connected device, there is risk potential. For example, recent research suggests that devices such as Fitbits can be hacked when the attacker is in close proximity. By focusing on accelerometers and other motion sensors, researchers at the University of Michigan and the University of South Carolina found that it's possible to, among other things, use sound waves at different frequencies to add thousands of steps to a Fitbit. (Scroll down to read Fitbit's response to the research results.)

Here's what you should know about the security and privacy risks of wearables, and the best practices for minimizing those risks.

1. Wearable security is a legitimate concern

With all the security concerns that enterprise IT already has in mind, does it also need to worry about wearables?

Yes, says Jeff Pollard, a principal analyst focused on security and risk at Forrester Research. For example, some fitness trackers can provide geolocation data "minute by minute to the cloud," sharing employee as well as company locations. At the same time, "enterprise employees and consumers are opting in to data aggregation and analytics at a daunting scale," he explains.

"Though IoT devices and wearables don't necessarily create new security vulnerabilities, they reintroduce a lot of old ones," says Steve Manzuik, director of security research for Duo Security, a cloud-based trusted access provider.
Such devices are "like the wild West of easy hacking targets that many experienced with mainstream computing back in the 90s," he says.

As with typical consumer IoT devices, wearables "in most cases don't ship with built-in security and so they're vulnerable to being compromised," says Vinay Anand, vice president of ClearPass Security at Aruba Networks, an enterprise wireless LAN provider.

"From an enterprise IT standpoint, this could be particularly worrisome because of the channel wearables maintain with smartphones that adversaries could exploit," Anand says. "As the wearables are usually connected to a variety of cloud apps and, depending on an organization's BYOD policy, the corporate network, this can be a launch point for an attack. This means that malware and other forms of attacks can use that path to compromise the phone and then other resources inside the network. The attacker would have access to legitimate enterprise credentials that would lead to loss of, or the ransom of, sensitive data."

2. In the scheme of things, wearable security may not be a huge concern

To put things in perspective, the security and privacy risks associated with wearables are "quite low, but it escalates with the type of device," says Chet Wisniewski, principal research scientist for security software developer Sophos.

"Pure biometric activity trackers like pedometers and heart rate monitors may leak information over Bluetooth but it's reasonably difficult to capture, and it's of little value to attackers," Wisniewski says. "As you move up to things like smartwatches the risk increases, but mostly due to trust and theft, not so much interception. A found smartwatch within a few meters of the paired smartphone could be used to steal emails and contacts. This risk may increase with some of the newer smartwatches that have an LTE connection, as they can operate away from the paired device."

In general, "personal connected devices that primarily operate via close-proximity protocols, like Bluetooth Low Energy, and piggy-back onto mobile devices, such as smart phones, are generally less directly accessible for abuse, vs. IoT devices that are actively connected to the internet via Ethernet or Wi-Fi," adds Michael McNeil, global head of product device security for Philips Healthcare, which provides clinical healthcare systems and consulting.

Many non-activity-tracking IoT devices run on commodity hardware with firmware that's often not "purpose built" and thus could expose extra services, such as SSH or Telnet remote administration or complex web application back ends, McNeil says. "Personal fitness devices are often very restrictive due to size and computing capabilities, with more specific engineering involved that provides less direct attack surface," he says. "So, much of the risk is usually with the security of the services that store and transmit this personal data to and from the mobile application or other means of data transfer/functionality. Management of these risks should take into consideration these specific parameters of the IoT devices and their possible attack surfaces."

In addition, Fitbit, the leading wearable maker for corporate wellness, has much to lose if it doesn't take security seriously. According to IDC, Fitbit is still the top maker of activity trackers, though it has lost some market share. The company also has a corporate division, Group Health, which offers wellness programs to customers such as Adobe, McKesson and BP.
And Fitbit CEO James Park has said recently that growing its Group Health business "is critical to the growth of the company."

To help safeguard against hacks and to protect data, Fitbit devices receive firmware updates that address security (and functionality) as needed and include built-in encryption when syncing data to the cloud, says Marc Bown, Fitbit's senior security engineer.

Other security steps Fitbit takes include the following:

- Partnering with a customer's IT and/or security team to "proactively address any questions or concerns" regarding the security of employee fitness and health data, says Amy McDonough, vice president and general manager of Fitbit Group Health.
- Offering an invite-only bug bounty program to augment the research and testing that Fitbit's security response team conducts.
- Posting explanations of tracker firmware updates. Since spring 2016, Fitbit has also labeled client software updates that contain security fixes with a "Critical/Important/Moderate/Low" rating to provide "guidance for interpreting those ratings similar to best practices from Google, Microsoft, and others," according to a Fitbit blog post on security.
- Developing best practices around the activity tracking data employers obtain from employees who participate in Fitbit wellness programs.

Fitbit says the recent hack by researchers, who manipulated its trackers' accelerometers via sound waves, "is not a compromise of Fitbit user data and users should not be concerned that any data has been accessed or disclosed." In an official statement, Fitbit added that "we carefully design security measures for new products, continuously monitor for new threats, and rapidly respond to identified issues."

Wearable security best practices

3. It's important to anonymize data

Companies that collect but don't carefully anonymize health-related data have effectively acquired what's known as electronic Protected Health Information (ePHI), "which puts you squarely in the HIPAA world," warns Eric Hodge, director of consulting at CyberScout, a data risk management and identity protection firm. You must then "worry about complying with all kinds of HIPAA requirements just as a hospital would," he says. You're also exposed to the same fines for noncompliance, which lately have ranged from $150,000 to $6 million. As a precaution, be sure to dissociate information about health and fitness from the individual, he adds.

4. Segregate wearables on a different network

IT should treat wearables like any other computing device on the network, Manzuik says. "When possible, consider segregating IoT devices to their own network and don't connect them directly to the internet."

Because some IoT devices have "a history of poor security," organizations should keep these devices on a dedicated network that doesn't provide any access to internal resources, such as a guest Wi-Fi network, adds Matias Woloski, CTO and co-founder of Auth0, a universal identity platform.

5. Do your due diligence

Is the IoT company HIPAA-compliant? Does it adhere to standards? How does it manage credentials and identity? Is there an easy revocation strategy in case a device is lost or stolen? These are a few questions CISOs should ask wearable/group health platform providers, says Woloski.

Corporate fitness and wellness programs are typically tied to third-party software platforms that request permission to access the data generated by trackers or other devices, Woloski adds.
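In practice, that kind of delegated access is usually granted and revoked with standard OAuth 2 flows. The sketch below shows roughly what an RFC 7009 token revocation request looks like; the endpoint URL, client ID and token values are hypothetical placeholders, not any real wellness provider's API.

```python
# Sketch: revoking a wellness platform's OAuth 2 access token (RFC 7009).
# The endpoint, client credentials, and token below are hypothetical.
import base64
from urllib.parse import urlencode

REVOCATION_ENDPOINT = "https://wellness.example.com/oauth/revoke"  # hypothetical

def build_revocation_request(token: str, client_id: str, client_secret: str):
    """Build the URL, headers, and form body for an RFC 7009 revocation call."""
    # RFC 7009: POST the token (plus an optional token_type_hint) as a form body.
    body = urlencode({"token": token, "token_type_hint": "access_token"})
    # The client authenticates itself, e.g. with HTTP Basic auth (RFC 6749 §2.3.1).
    credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {credentials}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    return REVOCATION_ENDPOINT, headers, body

url, headers, body = build_revocation_request("abc123", "corp-wellness", "s3cret")
# An HTTP client (urllib.request, requests, etc.) would POST `body` to `url`
# with `headers`; a 200 response means the token can no longer be used.
```

The point of the sketch is the revocation strategy Woloski describes: because access is a bearer token rather than a shared password, a lost or stolen device can be cut off with one call instead of a credential reset.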
CISOs should look for wearable providers that expose their API using authorization protocols such as OAuth 2, so that users can stay in control and revoke access whenever they want, he says.

6. Educate users

It's important to educate users about the type of data wearables collect, where it goes, and how it might be used, notes Pollard. "It might seem like the data I share with a (wearable) app stays on my smartphone or wearable. In reality, it goes to the cloud and might be shared with a number of third parties. Less sophisticated users may never know that happens, or that they could opt-out of it when or if given the choice."

7. Limit access to employee fitness and wellness data

To run a successful wellness program or fitness challenge, an enterprise needs opt-in data from participating employees, such as how many steps they've taken. But you should restrict wellness program data access to those who need it to run the program, advises McDonough.

8. Get a clear picture of everything connecting to the enterprise network

"Understanding the full inventory of assets connecting to your enterprise network is critical," says Anand. "You can't protect what you don't realize is on your network, so a process to profile and set policies for all devices that wearables would connect to on your network is an important first step."

Wearables "should be treated as potential threats like any other computing device," notes McNeil. "Keep an inventory of them, utilize mobile device management to understand which employees are using related mobile applications on their phone, and ensure that communications used by these devices and companion software are observed to leverage proper encryption over the network."

9. Require multi-factor authentication

CISOs should require employees to use multi-factor authentication on their smartphones "as an added layer of protection," Anand says.
He also advises IT to "use behavioral analytics to identify abnormal patterns of IT access and usage. At the first sign of suspicious behavior associated with a user's smartphone that is a known participant of a wellness program, IT can act to mitigate any potential damage."

10. Prepare for security and privacy risks, especially in the short term

We're "a long way" from "IoT anti-malware solutions," notes Pollard. Wearables use a variety of third-party components, operating systems and software; there's no dominant operating system, such as Microsoft Windows, to standardize or build upon, he explains.

So the road will likely be rocky in the near term. Long term, the security situation will improve, Manzuik says. But it could take a few high-profile vulnerabilities or hacks to get us there.