Lecture 4

Understanding the Device

“If you want to understand the smartphone, you need to use the smartphone.” If you’re building an app for iOS, you need to use and explore an iPhone to gain an understanding of how the phone works.

  • First 5 seconds with a new phone – Apple users are learning the basics:
  • 1. The screen is touch-based.
  • 2. The user interface elements are touch-based.
  • 3. The user engages the interface elements with fluid gestures such as tap and swipe.
  • 4. The hardware buttons are secondary to the touch experience.

GESTURES:

 “The Tap” – A brief touch on the glass, used to open an app or select an object.

“The Drag” – A sustained movement across the screen; examples are the slide-to-unlock control or dragging an app icon.

“The Flick” – A shorter, quicker movement, often used to scroll through content.

“The Swipe” – A controlled, faster drag, used for menus and browsing photos.

“The Pinch” – A two-finger gesture to zoom in and out on photos and maps.

Other gestures include the shake function (for example, shake to undo).
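These gestures map directly onto UIKit’s gesture recognizers, which is how an app would actually listen for them. A minimal sketch, assuming a standard UIKit view controller (the handler names here are my own placeholders):

```swift
import UIKit

class GestureDemoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // "The Tap" – a brief touch on the screen
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        view.addGestureRecognizer(tap)

        // "The Swipe" – a controlled, faster drag (e.g. for menus and photos)
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
        swipe.direction = .left
        view.addGestureRecognizer(swipe)

        // "The Pinch" – zoom in and out on photos and maps
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        view.addGestureRecognizer(pinch)
    }

    @objc func handleTap() { print("tapped") }
    @objc func handleSwipe() { print("swiped left") }
    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        print("pinch scale: \(gesture.scale)")
    }
}
```

Each recognizer is attached to the view and calls back into the controller, which keeps the gesture handling in one place.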

Part 2: UI – iOS Anatomy

  • Navigation bar – The top of the screen (actions buttons such as back)
  • Tab bar – The bottom navigation bar (example: Facebook – News Feed, Messages)
  • Action menu – AirDrop, share buttons, print, etc.
  • Alert
  • Segmented control
  • Map view
  • The toolbar – Supports actions for the current view and page (example: Safari)
  • Screen Sizes
  • Inputs – Sliders and switches (example: Brightness)
  • The Keyboard
  • Pickers and Date Pickers
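Several of these elements can be created directly in code. As a small illustration, here is a minimal sketch of the alert element from the list above, assuming UIKit (the titles and the function name are placeholders of my own):

```swift
import UIKit

// A minimal sketch: presenting an alert with two actions from a view controller.
func showDeleteConfirmation(from viewController: UIViewController) {
    let alert = UIAlertController(title: "Delete Photo?",
                                  message: "This cannot be undone.",
                                  preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
    alert.addAction(UIAlertAction(title: "Delete", style: .destructive) { _ in
        // handle the deletion here
    })
    viewController.present(alert, animated: true)
}
```

The `.destructive` style renders the action in red, which matches how iOS signals dangerous choices to the user.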

Summary: This week’s lecture focused on gestures and a basic understanding of the device you’re designing for. I learnt that the first few seconds are important for capturing the user, so you want to provide a simple experience that doesn’t scare users off. This lecture was great for my app development in terms of using gestures such as the swipe and the flick to create a better experience for the audience. I will be using these gestures to improve my app’s flow.
