Input matters for Chrome OS | Session

In this session, we take a look at how to approach app input given the increasing variety of device form factors. Users are increasingly interacting with apps using keyboards, mice, MIDI (musical instrument digital interface) devices, game controllers, and styluses in addition to touchscreens. We also discuss how to support these input sources and why it is essential for your app to do so.

Resources:
Input compatibility → https://goo.gle/2S4mODp
Handling different types of input → https://goo.gle/2R8ewKk

Speaker: Emilie Roberts

Watch more:
Web at Google I/O 2021 Playlist → https://goo.gle/io21-ChromeDevs
All Google I/O 2021 Technical Sessions → https://goo.gle/io21-technicalsessions
All Google I/O 2021 Sessions → https://goo.gle/io21-allsessions

Subscribe to Google Chrome Developers → https://goo.gle/ChromeDevs


#GoogleIO #Android #ChromeOS





Content

1.67 -> [Music]
5.279 -> hi my name is Emilie Roberts and i am a
8.32 -> developer advocate for chrome os today
10.88 -> i'm going to be discussing input and why
13.28 -> it matters to you as an app developer
16.08 -> there is a whole world of device form
18.4 -> factors out there foldables rollables
20.8 -> convertibles tablets laptop devices in
23.199 -> addition to phones
24.56 -> with all of these form factors come
26.48 -> different sizes and shapes of screens we
29.439 -> have a number of talks this year
31.039 -> specifically about large screens and
32.8 -> layout designs so check them out however
36.079 -> i would like to talk about the other 50%
38.64 -> of the user interaction equation the
40.8 -> half of I/O that is sometimes neglected
43.92 -> if output is the O then the I is input
48.239 -> so what's the big deal should an input
50.239 -> just work
51.44 -> sadly not always
53.199 -> many standard ui controls will work as
55.68 -> expected automatically for users across
58 -> devices but there's a limit to the
60.48 -> extent the framework can guess how your
62.64 -> app should respond
64.159 -> without proper input support your app
66.4 -> could seem broken
67.92 -> could be no fun to use could be
70 -> inaccessible to some users requiring
72.159 -> accessibility tools
74 -> and lastly your app could be missing out
76.56 -> on creating compelling engaging and
78.64 -> differentiating user experiences
81.439 -> so my call to action today is really to
84.88 -> think through input when designing your
86.96 -> app right from the base the reality is
89.84 -> that users are already using your app
92 -> with mice keyboard stylus and more and
94 -> not just touch
95.28 -> embracing input as part of the design
98.479 -> means your app will be more intuitive
100.159 -> and delightful to use
102.24 -> okay let's start simple and this is a
105.04 -> situation that arises often when an app
107.6 -> designed for mobile is run on a
109.28 -> chromebook or on an android phone or
111.36 -> tablet with a bluetooth keyboard
112.88 -> connected there is the text box the user
115.759 -> types a bit the user presses enter and
119.28 -> nothing happens
120.799 -> when a user's hands are already on a
122.719 -> keyboard they will assume the enter key
124.56 -> will work it is not intuitive to lift
126.719 -> your hands off and poke the laptop
128.399 -> screen
129.28 -> and don't forget some android capable
131.52 -> chromebooks do not have touch screens
134.08 -> the fix is easy and can make a huge
136.4 -> difference in terms of a user's app
138.16 -> experience let's take a look
140.239 -> here's the code you need to handle the
141.84 -> enter key press
143.28 -> first of all notice we are overriding
145.599 -> the activity's onKeyUp method onKeyUp
148.72 -> listens for when keyboard keys are
150.48 -> released there is a corresponding
153.68 -> onKeyDown for when a key is pressed but in
156 -> many cases it is easier to use onKeyUp
158.8 -> this is because onKeyDown will detect if
161.12 -> a key is held down and will send
162.64 -> multiple events
164.56 -> next look at the event's key code to see
167.519 -> which key was pressed and released in
169.44 -> this case we're looking for the enter
170.879 -> key
172.08 -> then take action on that key code in
174.8 -> this case calling the send message
176.48 -> function this will probably be the exact
178.879 -> same function you would have called if
180.319 -> the onscreen button was pressed so there
182.48 -> is no need for a new code here
184.72 -> and that's it for triggering app
186.959 -> functionality but you will need to do
189.12 -> two more things
190.879 -> first be sure to indicate to the system
193.28 -> that the enter key was handled by your
195.36 -> app by returning true
197.76 -> likewise in the event that your app did
199.92 -> not handle a given event pass it up to
202.239 -> super to allow the system to take action
204.48 -> on it
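
In Kotlin, the pattern described above might look like this minimal sketch; sendMessage() is a hypothetical stand-in for whatever function your on-screen send button already calls:

    // Inside your Activity; import android.view.KeyEvent
    override fun onKeyUp(keyCode: Int, event: KeyEvent): Boolean {
        return when (keyCode) {
            KeyEvent.KEYCODE_ENTER -> {
                sendMessage()  // hypothetical: the same action as the on-screen button
                true           // tell the system this event was handled
            }
            else -> super.onKeyUp(keyCode, event)  // let the system act on the rest
        }
    }
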
205.44 -> for more information check out the
207.36 -> android documentation at
209.12 -> developer.android.com
212.319 -> so
213.04 -> it's 2021. if users have their hands on
216.159 -> a keyboard they will expect basic
218.4 -> keyboard functionality so the spacebar
220.799 -> should work for pausing and playing
222.319 -> media and undo and redo shortcuts should
225.28 -> exist when it's appropriate like in a
227.44 -> text editor
228.72 -> however i'd like to invite you to think
231.76 -> more deeply about your app's input and
234.72 -> think about how you could fundamentally
236.48 -> reshape the user experience
239.12 -> here is an example of MWM's edjing app
244.16 -> [Music]
257.69 -> [Music]
266.479 -> hey
268.13 -> [Music]
279.199 -> let's talk about what we saw
281.12 -> edjing works with touch on phones and
284 -> tablets which is great for casual users
286.72 -> or even power djs who may want to use
289.12 -> their phone or chromebook while riding
290.72 -> the metro home to build out their set
292.72 -> for later that day
294.4 -> for someone who wants to get farther
296.08 -> into djing or who wants to do the music
299.04 -> for a party or a wedding midi
301.12 -> controllers are a great option they
302.96 -> offer a nice low latency tactile feel
305.84 -> and make it easy to crossfade scratch
307.759 -> and apply different effects and filters
310 -> and this is all possible with the exact
312.4 -> same app as before
314.24 -> on the same device just now with the
316.8 -> midi controller attached
318.56 -> even pro djs might appreciate this
320.56 -> flexibility when they can't carry all of
322.72 -> their vinyl records around with them
324.88 -> and this is my favorite part a keyboard
328 -> is essentially a low latency tactile
331.36 -> input device right kind of like
334.16 -> a midi controller
335.919 -> edjing took the time to think about
337.759 -> chromebook users and realize that they
340 -> often have a keyboard and trackpad
342.16 -> already attached so why not turn those
344.88 -> into a built-in dj controller
347.759 -> they included keyboard mappings for all
349.759 -> the major actions and effects as well as
352.639 -> and this is cool trackpad-based
354.88 -> scratching and crossfading it's not
356.96 -> quite as fun as a full midi controller
361.039 -> but
362.08 -> it's a lot more portable let's look at
364.56 -> how to support these different input
366.24 -> methods in your app
368.16 -> midi coding can be daunting if you've
370.72 -> never worked with it before
372.56 -> luckily there are some handy open source
374.72 -> samples in the android media samples
376.479 -> library that can help you get started
378.479 -> check out the MidiSynth project at the
380.639 -> link here and in particular the synth
383.52 -> engine module in that project also just
386.319 -> a note if you use C++
388.639 -> starting in Android API level 29 so coming soon to
391.039 -> chrome os you can also use the NDK MIDI
394.319 -> API
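
A hypothetical first step before wiring up a synth engine, sketched with the android.media.midi API (available since API level 23); this is not code from the MidiSynth sample:

    // Sketch: enumerate attached MIDI devices with android.media.midi.
    import android.content.Context
    import android.media.midi.MidiManager
    import android.util.Log

    fun listMidiDevices(context: Context) {
        val midiManager =
            context.getSystemService(Context.MIDI_SERVICE) as MidiManager
        for (info in midiManager.devices) {
            Log.d("Midi", "found MIDI device: ${info.properties}")
        }
    }
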
396.16 -> adding keyboard functionality is just
398.4 -> the same as we looked at before with the
400.319 -> enter key here you can see checking for
402.479 -> the W, A, and L keys and the corresponding
405.68 -> app actions
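
Such a mapping could be sketched like this; the action functions are hypothetical names, not edjing's actual code:

    // Sketch: more key codes, same onKeyUp pattern as the Enter key.
    override fun onKeyUp(keyCode: Int, event: KeyEvent): Boolean {
        return when (keyCode) {
            KeyEvent.KEYCODE_W -> { toggleEffect(); true }   // hypothetical action
            KeyEvent.KEYCODE_A -> { crossfadeLeft(); true }  // hypothetical action
            KeyEvent.KEYCODE_L -> { toggleLoop(); true }     // hypothetical action
            else -> super.onKeyUp(keyCode, event)
        }
    }
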
406.88 -> as demonstrated in the app trackpad
409.36 -> input can add really cool
411.52 -> two-dimensional tactile experiences to
413.68 -> apps
414.56 -> here's a simplified example of some code
417.039 -> that would take trackpad input and
419.039 -> convert it into a control signal for a
421.36 -> record scratching function
423.199 -> first
424.08 -> look for generic motion events
426.56 -> then record the change in the x and y
429.12 -> position of the pointer
430.88 -> then calculate the hypotenuse to get the
433.599 -> total distance traveled by the pointer
436.08 -> and then do something with that info
438.4 -> here we are scratching the record
440.639 -> and that's it
441.759 -> we are going to improve this code later
443.68 -> on so stay tuned for that
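
Those four steps might be sketched like this inside an Activity; scratchRecord() is a hypothetical function that applies the scratch effect:

    // Sketch: convert trackpad motion into a scratch control signal.
    // import android.view.MotionEvent; import kotlin.math.hypot
    private var lastX = 0f
    private var lastY = 0f

    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        // record the change in the pointer's x and y position
        val dx = event.x - lastX
        val dy = event.y - lastY
        lastX = event.x
        lastY = event.y
        // hypotenuse = total distance traveled by the pointer
        val distance = hypot(dx, dy)
        scratchRecord(distance)  // hypothetical: do something with that info
        return true
    }
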
446.319 -> with so many great input options another
449.36 -> issue arises what happens when users
452.319 -> switch between them maybe i start
454.479 -> playing a game in laptop mode then move
457.039 -> over to the couch and flip it into
458.88 -> tablet mode and then later on tent the
461.44 -> device put it on the coffee table and
463.84 -> attach a game controller or another
466.56 -> situation and i was fortunate enough to
469.28 -> have taught high school and witnessed
471.12 -> this first hand with my students
474 -> some people use their devices very
476.84 -> creatively twisted sideways using the
479.599 -> trackpad and touchscreen at the same
481.199 -> time and then occasionally hitting
483.28 -> keyboard keys too it was quite wonderful
485.919 -> to see but the question is how does your
488.8 -> app handle something like this
491.599 -> the number one thing is support your
493.52 -> users let them use whatever input
495.919 -> devices they want whenever they want
498.96 -> this means your app should always
500.879 -> respond appropriately to any supported
503.28 -> input
504.24 -> if a user has a game controller
505.919 -> connected they're clicking the keyboard
507.84 -> and the touchscreen all at the same time
510.24 -> no problem respond to it all as expected
514.159 -> things get more complicated however when
516.88 -> it comes to ui
518.479 -> there is a distinction between input
520.479 -> events coming in and what is currently
522.56 -> being shown on the screen
524.32 -> for example some touch-based games might
526.959 -> have an on-screen joystick like this
529.2 -> pretend racing game here
531.12 -> if the user is using the keyboard to
532.8 -> play and not touch
534.399 -> you probably don't want to use up screen
536.08 -> space with that on-screen joystick and
538.24 -> could just fade it out
540 -> there may also be situations when an app
542.32 -> or game has different prompts that are
544.48 -> dependent on input
546.16 -> for example press m for nitro with the
548.959 -> keyboard attached but press x with the
551.68 -> game controller and click here for a
554.48 -> touch interface
555.839 -> here's a look at a game that handles
557.6 -> this well: Dead Cells
559.68 -> when using a keyboard and mouse the
561.76 -> prompts and text show the keystroke and
563.92 -> mouse button indicators
566.08 -> for touch input you can see the
568.08 -> on-screen joystick on the left
570.08 -> a touch icon to interact with items and
572 -> characters in the middle and on the
573.6 -> right the jump crouch and item touch
576.16 -> targets
577.279 -> finally for a game controller all the ui
580.08 -> elements correspond to the appropriate
581.76 -> controller buttons as the user expects
584.64 -> cool but how do you implement something
586.88 -> like this
588.399 -> one approach is what i call a lazy state
590.959 -> machine it is a prioritized state
593.2 -> machine that feels lazy because although
596.56 -> input events from all devices are
598.56 -> always responded to immediately the ui
600.88 -> changes may be slower to transition
604.72 -> let's get concrete
606.56 -> here you can see a flowchart for three
608.64 -> input states touch keyboard and game
611.04 -> controller
612.24 -> decide on the priority of each input the
614.32 -> numbers 1, 2, and 3 here
616.399 -> for the race car game example from
618.24 -> before or like with Dead Cells if the
620.8 -> user is using touch at all they need the
623.519 -> joystick and buttons on screen or else
625.92 -> the game might be hard to play even if
628.079 -> keyboard events are being received if
630 -> the user is touching the screen that
631.92 -> on-screen joystick needs to be shown so
634.959 -> the touch state gets the highest
636.72 -> priority number one
638.88 -> when does the game move out of the touch
640.56 -> state
641.76 -> if the keyboard is receiving input but
644.24 -> there haven't been any touch events for a
646.079 -> while let's say five seconds it'd be
648.8 -> nice to fade out the unused touch
650.56 -> joystick and buttons to maximize screen
652.959 -> real estate so
654.88 -> keyboard input plus
657.04 -> no touch input for 5 seconds equals
660.16 -> transition the state machine to the
661.76 -> keyboard input state
663.92 -> again the moment there is any touch
665.839 -> input we should immediately move back to
667.92 -> the touch state
670 -> likewise if the game controller is
672.079 -> receiving events and neither the touch
674.16 -> screen nor the keyboard has received
676.32 -> input for a while the ui can move into
678.72 -> the game controller state
680.64 -> the instant there is any keyboard input
682.56 -> it should move to the keyboard state or
685.6 -> if there's any touch input it should
687.36 -> move to the touch state
690.079 -> this 5 second lazy delay before
693.279 -> transitioning to lower priority input
695.36 -> states prevents the ui from flickering
697.68 -> back and forth in the event of someone
699.76 -> using multiple input methods at the same
701.76 -> time
703.279 -> a last observation in this model the app
706.32 -> is only reacting to actually received
708.56 -> events so you're not trying to check if
710.959 -> there are keyboards or bluetooth
712.399 -> controllers attached or in any way
714.48 -> trying to guess how the user might be
716.56 -> interacting
717.839 -> trying to do that will not always work
719.76 -> and it will be slower to respond to
721.36 -> changes it is better just to act on real
724.079 -> received input events
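
One possible shape for this lazy state machine, sketched in Kotlin with the three states and the five-second delay from the example (all names are hypothetical):

    // Sketch: prioritized "lazy" input state machine.
    // Priority: TOUCH (1) beats KEYBOARD (2) beats CONTROLLER (3).
    enum class InputState { TOUCH, KEYBOARD, CONTROLLER }

    class LazyInputState(private val lazyDelayMs: Long = 5_000) {
        var state = InputState.TOUCH
            private set
        private var lastTouchMs = 0L
        private var lastKeyboardMs = 0L

        // call these from the real event handlers, passing the event time
        fun onTouch(nowMs: Long) {
            lastTouchMs = nowMs
            state = InputState.TOUCH  // top priority wins immediately
        }

        fun onKeyboard(nowMs: Long) {
            lastKeyboardMs = nowMs
            // move "down" to keyboard only after touch has been quiet a while
            if (nowMs - lastTouchMs > lazyDelayMs) state = InputState.KEYBOARD
        }

        fun onController(nowMs: Long) {
            // move to controller only once touch and keyboard are both quiet
            if (nowMs - lastTouchMs > lazyDelayMs &&
                nowMs - lastKeyboardMs > lazyDelayMs
            ) {
                state = InputState.CONTROLLER
            }
        }
    }

The app keeps acting on every event immediately; only the UI reads state to decide which prompts and controls to show.
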
726.32 -> okay that was a pretty quick overview of
728.48 -> this concept for more detail and code
730.72 -> check out our documentation on
732.32 -> chromeos.dev
734.16 -> on the subject of game controllers
736.16 -> chrome os supports all the button
738.079 -> mappings that android does including
740.16 -> Xbox, Xbox 360, PS3, and PS4-based
744.079 -> controllers and in addition we have
746.56 -> added mappings for other popular
748.32 -> controllers like 8BitDo and Logitech to
751.519 -> chrome os we then work to get these new
753.839 -> mappings ported back to the android
755.839 -> framework so other android devices can
757.92 -> benefit
759.68 -> actually handling gamepad events in code
762.32 -> looks a lot like the keyboard handling
764 -> code from before the difference is just
766.24 -> the key code
767.519 -> here's some code looking for the x
769.44 -> button and the left arrow on the d-pad
772.399 -> also with games you often want to know
774.399 -> if a button is being held down or get
776.72 -> that extra bit of responsiveness by not
779.04 -> waiting for a button release
781.2 -> in these cases you might choose to use
783.12 -> onKeyDown instead of onKeyUp as
785.839 -> mentioned before
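
The button handling could be sketched like so; fireNitro() and steerLeft() are hypothetical actions:

    // Sketch: same onKeyUp pattern, gamepad key codes.
    override fun onKeyUp(keyCode: Int, event: KeyEvent): Boolean {
        return when (keyCode) {
            KeyEvent.KEYCODE_BUTTON_X -> { fireNitro(); true }   // hypothetical
            KeyEvent.KEYCODE_DPAD_LEFT -> { steerLeft(); true }  // hypothetical
            else -> super.onKeyUp(keyCode, event)
        }
    }
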
787.76 -> handling game controller joysticks uses
790.32 -> a different method override than for
792.399 -> buttons and d-pads for that check out
794.88 -> the game controller documentation on
797.04 -> developer.android.com
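
For context, a rough sketch of that other override, reading joystick axes in onGenericMotionEvent (see the documentation for dead zones and historical values; steer() is hypothetical):

    // Sketch: joystick axes arrive as generic motion events, not key events.
    // import android.view.InputDevice; import android.view.MotionEvent
    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        val isJoystick =
            (event.source and InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK
        if (isJoystick && event.action == MotionEvent.ACTION_MOVE) {
            val x = event.getAxisValue(MotionEvent.AXIS_X)  // stick left/right
            val y = event.getAxisValue(MotionEvent.AXIS_Y)  // stick up/down
            steer(x, y)  // hypothetical app action
            return true
        }
        return super.onGenericMotionEvent(event)
    }
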
799.68 -> an android feature that really shines on
801.839 -> chromebooks is pointer capture this is
804.48 -> when the mouse cursor is captured by the
806.959 -> app meaning it is no longer visible on
809.12 -> the screen
810.24 -> input events go directly to the app and
812.399 -> the cursor won't get stuck on the side
814.72 -> of the screen if it goes too far in one
816.24 -> direction
817.519 -> one game that demonstrates pointer
819.12 -> capture well is Minecraft: Education Edition
821.36 -> which launched on Chromebooks
823.36 -> for schools last August
825.839 -> during gameplay you can see the user is
827.92 -> able to direct their point of view with
829.44 -> the mouse or trackpad without a visible
831.519 -> mouse cursor
832.8 -> if you interact with the building and
834.079 -> learning menus however the mouse pointer
836.399 -> reappears as you'd expect
839.12 -> here's some code showing how to
840.56 -> implement pointer capture you'll
842.48 -> recognize our dj scratching code from
844.88 -> before in the middle there
847.36 -> with our previous code imagine
849.44 -> scratching a track and seeing the mouse
851.44 -> cursor going all over the screen or
853.6 -> potentially getting stuck on the edge
855.12 -> and throwing off your groove
856.8 -> pointer capture is perfect for this use
858.88 -> case to hide that cursor let its
861.839 -> movement be unrestricted and still get
863.76 -> us the motion events we need
865.68 -> instead of onGenericMotionEvent like
868.16 -> before let's use a captured pointer
871.12 -> listener to respond to the motion events
874 -> the only difference here in the actual
875.68 -> scratching code is that the x and y
878.16 -> motion event values are relative to the
880.399 -> last motion events received instead of
882.959 -> being absolute screen coordinates like
885.04 -> before which makes sense as the pointer
887.6 -> is no longer bound by the screen
889.68 -> this means that the x and y values are
892.24 -> already relative deltas to the last
894.48 -> motion event so there is no need to
896.639 -> maintain and subtract the previous
898.24 -> values
899.279 -> the rest of the scratching code is the
901.12 -> same
902.24 -> when scratching should be triggered call
904.8 -> requestPointerCapture and when it is
907.199 -> finished call releasePointerCapture
910.079 -> that's it
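
In code, the captured version might be sketched like this; the View receiving the scratch gestures is passed in, and scratchRecord() remains hypothetical:

    // Sketch: pointer capture (API 26+) for the scratching use case.
    // import android.view.View; import kotlin.math.hypot
    fun enableScratchCapture(view: View) {
        view.setOnCapturedPointerListener { _, event ->
            // with capture, event.x / event.y are already relative deltas
            scratchRecord(hypot(event.x, event.y))
            true  // event handled
        }
        view.requestPointerCapture()  // hide the cursor and capture input
    }

    fun disableScratchCapture(view: View) {
        view.releasePointerCapture()  // cursor reappears, normal events resume
    }
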
911.199 -> for more information and samples check
913.279 -> out the android pointer capture
915.04 -> documentation
916.56 -> the last type of input i'd like to talk
918.72 -> about is stylus for drawing and painting
921.839 -> apps especially low latency stylus input
924.48 -> can make for an incredible experience an
927.04 -> app that does a great job of stylus
928.8 -> input is Concepts they have low latency
931.68 -> and tons of great features
933.6 -> here you can see how the user is able to
935.44 -> use the stylus to have precise control
938.16 -> over the drawing and use some really
940.32 -> nice brush and nudge effects
942.8 -> do you want your app to have low latency
945.199 -> stylus support too
946.8 -> well i am really happy to announce that
949.199 -> our low latency api is now available in
951.839 -> alpha it has built-in configurable
954.959 -> prediction and supports both cpu and
957.6 -> gpu-based rendering paths please check
960.399 -> out the api in the demo app available on
963.12 -> github and file any issues or feature
965.279 -> requests on the github tracker so much
968.48 -> input so little time
970.8 -> that brings us to the end of this talk
972.88 -> and i hope i have convinced you to
974.8 -> really think deeply about input when
976.8 -> designing your app
978.399 -> for more information on all of these
980.48 -> topics and others such as supporting
982.56 -> large screens getting your app running
984.56 -> well on chrome os building web apps
987.199 -> optimizing game performance developing
989.68 -> on a chromebook and more check out
992.16 -> chromeos.dev
994.079 -> with that please enjoy the rest of io
996.8 -> and i hope to see you again really soon
1000.8 -> [Music]

Source: https://www.youtube.com/watch?v=FPuaaYpUd5s