My iOS 10 Wish List - Part 1
I have read a lot of iOS 10 and WWDC wish list posts, but none of them have impressed me. Perhaps this is because bloggers are trying to nail the tentpole features of the soon-to-be-announced operating systems. Instead of focusing on just those features, I am going to list everything, BIG or small, that I can think of, in no particular order. I am sure this list contains some of those proverbial "Nos", but this is my list of where I would like to see iOS go next and in the future as it continues to evolve in its role as a supercomputer in your pocket. Here we go...
Camera and Photos
As a photographer somewhere between hobbyist and professional, I have always found the iPhone intriguing as a camera. Year after year, Apple does incredible things with its camera hardware and software. We are reaching the threshold of what a single small-lens camera can do, but there is still room to improve, and multi-lens cameras are on the horizon. It's not hard to imagine the iPhone's pairing of hardware and software continuing to gain ground on SLR and mirrorless cameras and on photo management software.
Raw photo capture, and APIs to support editing Raw photos, are another way Apple can further improve the images you can capture with an iPhone. iOS has a plethora of amazing photography apps that are often unique to the platform, and Raw capture and editing APIs would greatly increase the quality of those photos and edits. Apple has already figured out a great system for keeping edits non-destructive with extensions and iCloud Photo Library, and the way iOS hides files from the user would enable quick conversion of Raw photos into JPEGs when sharing to social media apps and the web.
Camera+ already allows saving images as TIFFs, but you have to dig to find the setting. I prefer Apple's Camera.app interface, and the Camera+ workflow is probably a bit daunting for most users.
Image Recognition - Faces, Places, & Things
Apple's stance on privacy may have swung too far to allow completely open machine learning on photos, but I think Apple needs to spend some of the trust it has built around user data privacy and work on image recognition. Google Photos has shown what this kind of machine learning can achieve with its fantastic search; the problem with Google Photos is the Google brand. If Apple were to do image recognition, users would expect the information gained to directly benefit them and the photos to remain secure, rather than potentially being used to improve ad sales. I would absolutely opt in to this.
Follow Up - Tags, Keywords, and Metadata Editing
The recently launched and excellent Metapho 2.0 can already do most of these tasks; however, as a natural extension of making your photos searchable, the data behind them should be editable as well. With my iCloud library of over 20k photos, Siri is going to need some information about which photo I am requesting, so we will need a way to edit that metadata.
Complete Map View in Photos
One of my favorite features of Photos is looking at the map of where I have been in a given time period. Photos already lets you see a map view, but the caveat is that you can only zoom out as far as years, or view specific locations via search. Let us explore all of our experiences together in the same view.
Additional Shooting Modes and More Controls
Apple seems to introduce a unique shooting mode each year, and they have perfected simplifying complex shooting modes into the swipe of a finger and a tap of the shutter button. These modes have enabled people with little technical understanding to capture panoramas, slo-mo, time-lapses, HDR, and Live Photos. However, I would like to see more modes added, either to the swipe menu (or another mode picker) or as unique capture modes with manual controls. Slow shutter is one such mode that is simple but would enable a greater breadth of photo capture (fireworks and light trails, long landscape exposures, star trails, etc.) that isn't currently possible with Camera.app. Apple even opened up manual controls for camera apps with iOS 8, but didn't add much manual functionality to their own.

These controls could be hidden in Settings, like the options to turn on the rule-of-thirds grid and set video quality. When enabled, a bi-directional swipe system on the screen could keep the interface simple, as many other camera and editing apps have already demonstrated. The interface could also be cleaned up by merging shooting modes into something similar to the filters button.

Finally, adding smart shutter capabilities could help improve everyone's captures by enabling shutters such as big button (tap anywhere on screen to capture an image - great for "+" phones), leveling (capture only when the camera is level), or anti-motion (capture only when the phone is still enough for the lighting conditions and settings). These features would only make the millions and millions (or billions and billions) of photos taken on iPhones even more amazing.
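For context, those iOS 8 manual controls live in AVFoundation, and they show how little extra plumbing Apple's own app would need. Here is a minimal Swift sketch (the function name and chosen values are my own, and session setup and error handling are omitted) of dialing in a custom shutter speed and ISO:

```swift
import AVFoundation

// Sketch: set a long(ish) custom exposure on a capture device.
// Assumes the device has already been added to a running AVCaptureSession.
func setSlowShutter(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        // Clamp to what the sensor actually supports rather than assuming.
        let wanted = CMTime(value: 1, timescale: 2) // 1/2 second
        let duration = CMTimeMinimum(wanted, device.activeFormat.maxExposureDuration)
        let iso = device.activeFormat.minISO // lowest noise for a long exposure
        device.setExposureModeCustom(duration: duration, iso: iso, completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```

Note that iPhone sensors cap the maximum exposure duration at a fraction of a second, which is why third-party slow shutter apps typically stack multiple frames; a native mode would presumably do the same behind the simple interface.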
More Editing Tools in Photos
Another thing Apple introduced with Photos in iOS 8 was a remarkably simplified way to edit Light and Color in photos, while still letting more experienced editors fine-tune their shots. The problem is, not all corrections are as simple as light and color. I would love to see a "Detail" slider that handles sharpening and noise, as well as making photos softer or more dramatic. Improve the color slider's dropdown options to enable advanced color corrections. Steal Adobe's new content-aware cropping. Bring over the retouching brush, as well as maybe a few other brushes like burn and dodge. Pair all of this with Raw photo capture, the built-in non-destructive editing, and the best photo app ecosystem in existence, and iOS becomes the go-to place to capture and edit photos.
Notes
With the revamp to Notes.app and the iCloud backend last year, I was almost an instant convert. I am not lying when I say it helped me be more productive than ever through by far the hardest year of college. Greedily, I want it to continue to improve rather than go years without further updates.
Image and Document Markup
One baffling omission from Notes.app last year was the ability to mark up images and documents within your notes, even as Apple added fantastic yet simple drawing tools. If I only got to choose one feature to add to Notes this year, this would be it (though I have plenty more). I would love to see the drawing tools be able to mark up images and documents rather than just the blank canvas we currently get.
More Drawing Tools and Drawing Tool Extensions
On the heels of asking for the drawing tools to be able to mark up documents, I would love to see more drawing tools. Paper really nails a lot of these tools in a fairly simple interface. In addition, Apple could bring over the markup tools it already has in Mail.app to round out the annotation options. On top of adding native tools, Apple could also create an extension point that lets developers add drawing, annotation, OCR, and PDF markup tools within Notes itself, further extending the types of notes one can capture.
Note Types and Formatting Options
Not all notes are created equal, and we need to remember and recall all sorts of information. Yes, there are "apps for that", but having a single environment for all the information you want to remember makes searching easier. Note types could include things like a simple sticky note (colors could correspond with tags), data tables, marked-up documents, sketches, and lists. The iOS version also needs font formatting to change size and color, like its Mac counterpart.
Tags and Better Note/Folder Management
Adding a little metadata via tags would also be welcome. Tags could match the OS X (macOS) version, sync via iCloud, and be pervasive throughout both operating systems, providing an easy way to find all documents and information relating to a given tag.
One of the features that sold me on Notes versus Evernote was the ability to infinitely nest folders. However, you can only go multiple folders deep in the macOS version of Notes. I would like to see this feature implemented not only in Notes on iOS, but also in Photos albums, iCloud app folders, etc.
Another glaring omission is the ability to share notes and folders with others. This may be an upcoming feature, but it seems odd in light of the ability to share calendars, reminder lists, and iCloud albums (which I would prefer be part of iCloud Photo Library in the Photos Albums tab). Sending carefully formatted notes to friends and family shouldn't result in that formatting being lost, and sharing within Notes.app would solve that problem.
The addition of password-protected, encrypted notes in iOS 9.3 was a great surprise and a welcome feature for all of us privacy-minded nerds who watch family members store sensitive data and passwords in Notes.app. However, only notes containing text, photos, and sketches can be locked, so storing sensitive documents in Notes doesn't currently work, which leads to your outboard brain being fragmented across apps.
Messages
Additional Card Data Types
Currently, Messages displays cards for some but not all data types. If something can be shared via Messages and isn't plain text, the attachment should display in a nicely laid-out card. This could include things like notes (as mentioned above), webpages, weather information, and potentially 3rd party data via an API for defining data types. Since Siri will need to understand what types of data apps can provide and handle anyway, Apple could extend those definitions into visual representations of the data and kill two birds with one stone. Imagine sending a Dark Sky forecast to a friend that is also viewable when you ask for the weather in the Siri interface. Not only could rich visualization convey better information, but it could also lead to supporting developers via app purchases.
Stickers and Attachments
The current state of 3rd party keyboards is divided into two very different functions. The first is what Apple intended when enabling keyboard extensions: an alternate way to enter text into a message (or data field). The second use of keyboard apps is to add attachments to your messages, but the process of toggling through keyboards is convoluted and always leads me to uninstall all 3rd party keyboards. I have seen a lot of suggestions for how to fix 3rd party keyboards, but I think the model is simpler than many of the proposed solutions: allow text input keyboards to replace the default keyboard, and add an attachment extension point for stickers, gifs, and other attachment keyboard types. This could include or replace the camera (though I would leave it separate) and audio attachment buttons already in place. Security prompts for these two different systems could reflect the needs of each extension, and would make me feel better about granting "Refresh" rather than Full Access to a random gif keyboard maker. Then we can all send our friends and families Kimoji stickers and not have to worry about ever finding our default keyboard again.
While we are adding attachments: instead of beating the dead horse of Digital Touch on Apple Watch, Apple should include the drawing feature as an additional native extension alongside photo, video, and voice. There are 3rd party apps that imitate this feature, but if it were implemented natively, we could receive digital drawings on our Apple Watches without them looking like bad stick drawings.
Apple Pay Money Transfers
One of the rumored tentpole features of iOS 10 that I am completely on board with is Apple Pay-enabled money transfers via iMessage. I am a big believer in food and drink karma, but it never hurts to be able to easily send cash to friends and family. Full steam ahead on this one.
News
I am a diehard RSS reader, but I like the idea of a news or article reader that enables me to discover more about the topics I am interested in, even if they aren't linked within an article I have read. However, News is currently limited to a single stream. Not only does this make curating my own discovery on a topic difficult, it adds noise to the process. One way to solve this would be to allow users to make folders or "Magazines" of sources and topics. For instance, if I want to read up on photography or backpacking, I could place certain sources and topics within that folder, and choose to read from it with a dropdown at the top of the For You screen. This way I can discover great backpacking tips without seeing a headline about Donald Trump, which makes me instantly want to leave the app. Apple could anonymize data about which stories I like are grouped together and use it to make suggestions to other users with similar interests.
Improved Topic Recognition and Discovery
In addition to magazines, topic recognition and discovery could use some work. Since everything in News is fed into the app deliberately, Apple should absolutely be using machine learning and human oversight to classify the basic topic of an article. I have seen some terribly unrelated keywords attached to articles, and that does not give me much confidence that News will provide quality recommendations.
Once we have solid classification of what an article is about, it would be a boon to be able to mute a topic in your News feed. The Internet can become flooded with a single topic, such as an election, a celebrity death, a missing airplane, or a sporting event, and seeing endless articles about a topic you are uninterested in can be frustrating. Simple topic mutes would solve this, and a modal popup from tapping on the topic box could "snooze" the topic for varied intervals of time or indefinitely.
Reading List and Read It Later
Apple News already has a bookmark feature, which is probably part of the recommendation engine, but there is potential for that engine to learn from sources outside of Apple News as well. Apple could integrate Reading List into its recommendation engine, and potentially create a share extension that lets you feed it items and topics to further teach it about your interests.
Read To Me
Like M.G. Siegler, I listen to the majority of what I "read" on the internet. Apple's addition of the Speak Screen gesture has only amplified how much I listen, as it makes enabling text-to-speech a breeze. However, News is not optimized for the feature, and the page frequently jumps side to side if you are trying to follow along. I would love to tell Siri to "read this article to me," completely hands free, or to have a Read to Me action extension. It would be another huge accessibility win for Apple.
More to Come
I use iOS a lot, and I think there are tons of small and subtle features that could be implemented to make it an even more robust pocket operating system. With the dawn of extensions in iOS 8, the possibilities of what iOS can do seem endless. I have argued that the sensors and additional hardware in an iPhone (or iPad, and eventually the Apple Watch) make the potential of an iOS device an order of magnitude higher than that of a traditional computer. I hope you have enjoyed reading; please feel free to comment if I have sparked any ideas about the future of iOS in your mind. Stay tuned for part 2...