
iCloud Call Log Sync


Earlier today, reports surfaced on The Intercept and Forbes claiming Apple “secretly” syncs Phone and FaceTime call history logs on iCloud, complete with phone numbers, dates and times, and duration. The info comes from Russian software firm Elcomsoft, which said the call history logs are stored for up to four months.

Likewise, on iOS 10, Elcomsoft said incoming and missed calls made through third-party VoIP apps using Apple’s CallKit framework, such as Skype, WhatsApp, and Viber, also get synced to iCloud. The call logs have been collected since at least iOS 8.2, released in March 2015, so long as a user has iCloud enabled.

Elcomsoft said the call logs are automatically synced, even if backups are turned off, with no way to opt out beyond disabling iCloud entirely. “You can only disable uploading/syncing notes, contacts, calendars and web history, but the calls are always there,” said Vladimir Katalov, CEO of Elcomsoft. “One way call logs will disappear from the cloud is if a user deletes a particular call record from the log on their device; then it will also get deleted from their iCloud account during the next automatic synchronization.” Given that Apple currently possesses the encryption keys to unlock an iCloud account, U.S. law enforcement agencies can obtain direct access to the logs with a court order. Worse, The Intercept claims the information could be exposed to hackers and anyone else who might be able to obtain a user’s iCloud credentials.

Further, in a statement today, Apple said the call history syncing is intentional. “We offer call history syncing as a convenience to our customers so that they can return calls from any of their devices,” an Apple spokesperson said in an email. “Device data is encrypted with a user’s passcode, and access to iCloud data including backups requires the user’s Apple ID and password. Apple recommends all customers select strong passwords and use two-factor authentication.”

A lot of people are noting that Apple’s whitepaper on iOS security mentions call history in iCloud backups, but I can’t find anywhere in said whitepaper an acknowledgment that call history/logs are synced using iCloud. I’m not sure if this is just an oversight in documentation, or a lack of transparency.

What’s curious to me is that Apple uses iCloud for this type of sync at all. The narrative is that it’s necessary to sync calls across all devices - my understanding of iMessage is that Apple keys each device and keys each message to be accessible by all devices. Why not employ similar multi-keying for these types of sensitive records and use iMessage or something similar to enable the sync? Alternatively, if Apple is doing multi-keying when syncing these records, why doesn’t the whitepaper mention it?
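For illustration, the per-device multi-keying I have in mind can be sketched as follows. This is a toy, not real cryptography (the XOR “cipher” is a stand-in, and every name here is hypothetical, not Apple’s actual design); the point is only the shape: each record is encrypted once with a random content key, and that key is wrapped separately for each enrolled device, so the server holds ciphertext and wrapped keys but never a usable key of its own.

```python
import hashlib
import secrets

def keystream_xor(key, data):
    """Toy stand-in cipher: XOR data with a SHA-256-derived keystream."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def sync_record(record, device_keys):
    """Encrypt a record once, then wrap the content key for each device."""
    content_key = secrets.token_bytes(32)
    ciphertext = keystream_xor(content_key, record)
    # One wrapped copy of the content key per enrolled device.
    wrapped = {dev: keystream_xor(dk, content_key)
               for dev, dk in device_keys.items()}
    return ciphertext, wrapped  # the server stores these, but no usable key

def read_record(ciphertext, wrapped, device, device_key):
    """Any enrolled device unwraps the content key locally and decrypts."""
    content_key = keystream_xor(device_key, wrapped[device])
    return keystream_xor(content_key, ciphertext)

devices = {"iphone": secrets.token_bytes(32), "ipad": secrets.token_bytes(32)}
ct, wrapped = sync_record(b"call: +1-555-0100, 42s", devices)
assert read_record(ct, wrapped, "ipad", devices["ipad"]) == b"call: +1-555-0100, 42s"
```

Under a scheme like this, a court order served on the provider yields only ciphertext; the device keys never leave the devices.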

[via MacRumors]


Typed on ErgoDox Test Board

Off Doesn't Mean OFF

Shazam is always listening

Patrick Wardle’s reversal of Shazam’s code:

TL;DR When Shazam (macOS) is toggled ‘OFF’ it simply stops processing recorded data…however recording continues

Once installed, Shazam automatically begins listening for music, “ready to name that tune at a moment’s notice.” This song identification or “auto tagging” (in Shazam’s parlance) is of course the main functionality of the tool.

Most (security-conscious) users probably don’t want Shazam listening all the time. Shazam appears to oblige, seemingly providing an option to disable this listening:

However, sliding the selector to ‘OFF’ did not generate the expected, “Mic was deactivated” OverSight alert. Odd :\ …though this did match what the OverSight user reported to me.

My first thought was perhaps OverSight had ‘missed’ the Mic deactivation, or contained some other bug or limitation. However testing seemed to confirm that OverSight works as expected. For example, when one quits (exits) Shazam, OverSight does receive a “Mic Deactivation” notification from the OS, and alerts this fact to the user:

So is Shazam still listening even when the user attempts to toggle it to ‘OFF’?

Again, though it appears that Shazam is always recording even when the user has toggled it ‘OFF’, I saw no indication that this recorded data is ever processed (nor saved, exfiltrated, etc). However, I still don’t like an app that appears to be constantly pulling audio off my computer’s internal mic. As such, I’m uninstalling Shazam as quickly as possible!

From Digital Journal:

“There is no privacy issue since the audio is not processed unless the user actively turns the app ‘ON,’” James Pearson, Shazam’s VP of global communications, told Motherboard in a statement. “If the mic wasn’t left on, it would take the app longer to both initialize the mic and then start buffering audio, and this is more likely to result in a poor user experience where users ‘miss out’ on a song they were trying to identify.”
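Pearson’s explanation matches a common capture pattern: the mic callback continuously feeds a short ring buffer, and the ON/OFF toggle gates only whether that buffer is ever handed to the recognizer. A minimal sketch of the pattern (all names hypothetical, not Shazam’s actual code):

```python
from collections import deque

def fingerprint(chunks):
    # Stand-in for the real acoustic-fingerprinting step.
    return hash(tuple(chunks))

class TagBuffer:
    """Always-capturing ring buffer; the toggle only gates processing."""

    def __init__(self, seconds=10, chunks_per_second=10):
        # Old audio falls off the end automatically, so only a short
        # window is ever held in memory.
        self.ring = deque(maxlen=seconds * chunks_per_second)
        self.enabled = False  # the UI's ON/OFF toggle

    def on_audio_chunk(self, chunk):
        # Called for every mic callback regardless of the toggle --
        # which is why a mic-activity monitor like OverSight still
        # reports the mic as active when the app shows 'OFF'.
        self.ring.append(chunk)

    def identify(self):
        # The buffered audio is only processed when toggled ON.
        if not self.enabled:
            return None
        return fingerprint(self.ring)
```

This explains both observations at once: the mic stays hot (so the app can identify a song instantly, with a few seconds of lead-in already buffered), while nothing is processed until the user asks.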

A later statement from Shazam corroborates what Pearson stated, and what Wardle found in his reversal of Shazam’s code.

“Contrary to recent rumors, Shazam doesn’t record anything,” the company said. “Shazam accesses the microphone on devices for the exclusive purpose of obtaining a small fingerprint of a subset of the soundwaves, which are then used exclusively to find a match in Shazam’s database and then deleted.”

Recording or no, one would expect the off switch to stay true to expectations - to be an actual OFF switch. While Shazam may not retain the audio it captures while the microphone is active, an always-listening app still presents another attack vector that a malicious actor might use to co-opt the microphone.

[ via Objective-See & Digital Journal ]


Typed on ErgoDox Test Board

Philips Hue Motion Sensor


Philips’ Hue Motion Sensor seems like the perfect way to automate a great smart lighting platform, but is it?

I’ve been using SmartThings (with a 1st gen hub) for some time, but lately I’ve been interested in moving to a platform that supports HomeKit - enter Philips Hue. My biggest barrier to moving to an alternate platform has been a distinct lack of motion sensors. It’s pretty hard to have ‘automation’ without an automated way to trigger lights.

Philips Hue is the natural choice for a HomeKit-compatible setup. I’d been waiting for some time for the motion sensor to come out. Two weeks ago I picked up an entire rig:

I found the initial setup simple enough. Connect the bridge to my network, configure it using my phone - easy. I set about placing bulbs & sensors where I’d want them - namely the bedroom & bathroom. In those rooms I want to just walk in and walk out and let the lights take care of themselves. For other rooms in the house - like the living room - I’ll trigger scenes manually.

adding hue motion

[Source]

Adding the bulbs & accessories was dead simple too. Each device was quickly added and configured. Devices were also easily added to HomeKit.

adding hue motion

[Source]

All that was necessary was to scan a code provided on the Hue bridge’s box, then all my devices were present in the Home app on my iPhone…except they weren’t.

strike 1

As it turns out, the motion sensor is not HomeKit compatible. Little did I know when I made the purchase, but for whatever reason Philips did not see fit to do the work needed to make the motion sensor compatible. In theory they could enable compatibility in a future software update, though HomeKit devices require certain encryption hardware, so I’m guessing the requisite hardware simply isn’t present in the sensor.

HomeKit compatibility or no, I pressed on. I figured that I wouldn’t, strictly speaking, need HomeKit for triggering motion; I could just let the Hue app control that trigger, and worry about setting scenes for my other rooms manually via Siri.

On to the sensor itself. This sensor is quite impressive. Not only does it sense motion, it senses luminance as well - this gives it the ability to determine how bright to set the lights depending on the ambient light in the room. Great! Right away I noticed that this sensor is significantly more sensitive than the Ecolink sensors I currently use with SmartThings. My Ecolink sensors often fail to sense motion quickly, requiring me to wave my arms about to get them to trigger; the Hue sensor, on the other hand, caught the motion every time. Even better!

lighting depending on hours

[Source]

So I’ve got a great platform that pretty much just works (unlike SmartThings), and really high quality sensors. Everything should be perfect and happy, right?

No.

strike 2

As it turns out, Philips’ software controls for the motion sensor are really dimwitted. While you’re able to set light sensitivity and define how bright you want the light to be depending on the ambient light and time frame, they missed the most important thing with a motion sensor: the ability to override it. At time of writing there’s no way to have a scene take priority over the motion sensor.

If I’m in the bedroom reading, the lights turn off after 5 minutes (or whatever timeframe you define), meaning I have to wave my arms around to turn the lights back on. With SmartThings (and presumably other systems) you can define a mode that overrides the typical sensor behavior. Unfortunately there’s just no way to do this within the Philips app. Presumably one could define such a scene using the Home app on iPhone, but as said above, this sensor isn’t HomeKit compatible.

There’s no doubt that, of the limited number of lighting systems I’ve played with, Philips Hue is the best. The bridge & devices are fantastic - they’re so easy to add, and the lighting control is basically instant (unlike the multi-second delay I get with SmartThings) - but the inability to control that sensor as much as I’d like is a deal-breaker. Philips might add this functionality, they might not, but I’m not going to wait around to see if it happens. The whole rig was returned the next day.

The good news is that Elgato has announced a HomeKit compatible motion sensor. It’s due out before the end of the year. Perhaps once that is available I’ll try this again. I do have somewhat of a sour taste in my mouth, but I’m hopeful that a Philips + Elgato + HomeKit solution will be sufficient to replace my aging and slow SmartThings setup.


Typed on ErgoDox Test Board

iPhone 7


Got my Jet Black 7 yesterday. Fantastic so far. I’m in love with this phone.


Typed on MacBook Pro

Studio & Touch Bar


[Source]

SixColors - Perpendicular Philosophy

Microsoft believes that traditional computer interfaces and modern mobile-device touchscreen interfaces should be melded together, blurring the lines between tablet and PC. This week’s introduction of the Surface Studio—think of an iMac that can be folded down onto your desk and used as a gigantic iPad—is perhaps the most impressive iteration of that belief to date.

Apple, in contrast, believes that touchscreen interfaces are great and computers are great and they’re not the same thing. Apple has steadfastly resisted adding touchscreens to the Mac, and when you ask the company’s executives why, they have been remarkably consistent on this point for the past few years. What defines a computer, they’ll say, is that it’s made up of two perpendicular surfaces.

There’s a vertical display surface, more or less up and down, right in front of you. And there’s a horizontal control surface—a table or desk or the base of a laptop—that you use for input and control. If you want a Mac, that’s what you get. If you want a touch-based device, get an iPad.

I got the opportunity to play with a Surface Studio at a Microsoft Store yesterday - the machine is absolutely incredible. Incredible!

Surface Studio brings a true drafting table model to consumers. My employer is doing a refresh for our artists soon, and I’ve already asked that we get at least one for experimentation. I think it will prove popular.

I’m excited to try the MacBook Pro with Touch Bar. It’s a marked difference in computing perspective from Microsoft. Apple has created a technology that will prove valuable for both creative professionals and regular consumers.

It’s great to see these innovations from both companies.

One small thought though: $1799 for the base MacBook Pro with Touch Bar hurts. For the average Apple consumer that’s still spendy.

Update 11/1

I got the chance to play with the non-Touch Bar 13” MacBook Pro this afternoon.

The machine is noticeably thinner and lighter than the 2015 product, which is nice - but I’m not sure it was needed. I’m impressed by the keyboard; it’s a noticeable (although minor) improvement over the 12” MacBook Nothing keyboard. I can’t say I was the biggest fan of the 2015 product’s keyboard, but the one on this new machine is decent. Also, the new trackpad is GIGANTIC! No, really, it’s huge. Take how big you think it is - now make that 30% larger - that’s how big it is.

I’ve been waffling back and forth for ages about which of the Apple portables to get when I next replace my machine. I think at this point I’m firmly in the MacBook Nothing camp. This isn’t a knock against the new Pro at all, but it has become clear to me that the MacBook Nothing is the more appropriate machine for me.

Update 11/21

Played with a Touch Bar MacBook Pro over the weekend.

Quick Thoughts:

  • The Touch Bar itself is matte - surprise.
  • Speed is great. It changes views just as fast as you change apps.
  • Swiping on the Touch Bar to adjust volume/brightness is just fantastic.

I only had 5-10 minutes with it, so I’m not totally certain what I think. The Touch Bar strikes me as something interesting, but not something the new MacBook Pros needed.

Within the stock apps there is great functionality, but we’re obviously going to need third-party developers on board as well. Some of the interfaces (like adjusting volume) are fantastic; others are just downright confusing. I found myself getting a couple of menus deep, then forgetting how I got there.

Is it interesting to have right now? Sure. Do you need to spend $299 more to get it on the 13” product? No.


Typed on Octopage