JON WILEY: Hi everybody.
So during the keynote you heard a little bit
about material design, and we hope
to give you a little bit more detail about that today
and in the sessions that follow tomorrow.
But first I want to tell you a little bit
about our inspiration around material design.
Every object in this room owes its origin
to a few people throughout the millennia who
paid careful attention to their environment.
They sought out the very best materials,
and they learned their properties.
And they used this knowledge to make things.
And when you consider making things, the design
and the manufacture of things, we
inherit thousands of years of expertise.
In contrast, software design is,
relatively speaking, just getting started.
Much of interface design concerns itself
with what people see.
But with modern high resolution displays
coupled with the ability to physically interact
with the software your expectations are much greater.
In fact, there's thousands of years of expectations.
And so we took a step back.
We looked at all of this software
and we asked what is this made of?
We challenged ourselves to define the underlying
physics of our interfaces, and craft a visual language which
unifies the classic concepts of good design
with a solid understanding of the most
fundamental physical properties.
At first we thought like designers.
How should it appear?
How should it look?
But then we thought like scientists.
Why does it behave this way?
And after many experiments and many observations
we wrote down everything that we'd learned.
These are our material principles.
In Android 4.0, Ice Cream Sandwich,
we introduced a typographic, magazine-style UI,
and a lot of people liked it.
We were pretty happy with it.
But design is continually evolving.
Users are getting more sophisticated.
The design landscape is more sophisticated.
In particular, motion has become incredibly important
over the last few years.
We wanted something that took
the very best of graphic design clarity,
and the innovations in motion graphics
and motion communication, but that still
tapped into those elements of tangibility, of physicality,
that industrial designers themselves use.
So this led us to a question of how do we do this?
And the very first principle in material design
is metaphor, which seems a little random.
Why metaphor?
Metaphor is basically a very, very short story.
And like stories, metaphors are really powerful because they
are deep and dense in meaning.
They communicate more richly than verbal language can.
If I'm writing a play or if I'm telling you
about a character or a person in real life,
if I say she was a hurricane, I don't
have to tell you about her force of will
or her indomitable spirit.
I don't have to tell an actor that averting her gaze
would be inappropriate.
The metaphor is a form of knowledge transfer
that depends on shared experience.
And in fact, this capacity to transfer knowledge
and to transfer learning is one of the things
that defines humanity, and, in fact, defines intelligence.
So for us the idea of metaphor is a backstory for the design.
It unifies and grounds the design,
and it has two functions.
It works for our audience.
We want to present a metaphor that they can understand,
that they can connect with, that they can use to move faster
into understanding how to use things.
But it's also a metaphor for ourselves,
for the designers, and the developers, and the PMs,
and the QA people, everybody working together,
because when you have a metaphor that everybody understands,
intuitively understands, you don't
have to explain how they violated subsection C, clause
2 of your style guideline.
They just know it feels wrong.
They know it's out of place.
So why this particular metaphor?
Why did we imagine a material that
was a form of paper sufficiently advanced
as to be indistinguishable from magic?
Well, one part of it is of course
that we do have a lot of experience
as humanity communicating with paper.
Paper is just rich in a history across all our cultures
of conveying information, and it naturally
affords so many different ways of interacting with it.
But the other aspect of paper is that it is a physical thing.
It exists in the world.
And this idea, that surfaces, because they
are tangible, are a metaphor that we can use to accelerate
understanding of our UI, is really important.
You have this perception of objects and surfaces
that's happening in the more primitive parts of your brain.
It's happening in these visual cortexes that
are in the back and lower parts of your brain.
And that means they're simply easier than language.
They are more natural than language.
You have this inherent understanding
about the separation of things and the relationships of things
that allow us to look at this and have it make sense,
even though we know there is no material in the world that
could possibly do this.
It is irrational and yet feels completely natural.
And that's what we want when we're
creating digital, magical interfaces.
Because we are not constrained by the laws
of the real world in our user interfaces.
Surfaces are intuitive.
And that's why we use them as the foundation.
They organize space and rationalize the interaction.
And it matters that you preserve this inherent sense
of what's right.
Not for the sake of artifice, but in order to make the mind
work less.
One of the things you'll discover in our material design
documents is that our buttons rise to the surface in response
to touch instead of sinking into a surface,
like a fake, plastic button would.
And we do this because we want this illusion to be consistent.
The idea that when your finger touches
that glass on your phone that surface
is coming up and meeting your finger
at the point where it touches.
JONATHAN LEE: Content is bold, graphic, and intentional.
We pushed ourselves when we were thinking about material design
to make clear and deliberate design decisions regarding
color and typography.
So we're embracing these classic visual design
principles that Jon and Matias have both
spoken about in our new framework.
With Ice Cream Sandwich Android introduced
a new system font, Roboto.
And today we're updating Roboto to go beyond phones
and tablets, and be the default typeface for the material UI.
Here you can see the current version of Roboto.
And Roboto is now slightly rounder, friendlier,
and most importantly, redrawn to perform on desktop and beyond.
And I can tell you from personal experience
that it handles really well.
We also developed and documented our design guidelines
for typographic scale.
By using one typeface and a handful of weights for emphasis
we believe that it delivers consistent and clear
hierarchies in your products.
Using scale and appropriate display sizes
you can create dynamic, print-like layouts
using white space and contrast.
This focuses your users on the content that really matters.
Using vertical key lines and baseline grids, content
is bold, elegant, and graphically clear.
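The "one typeface, a handful of weights" idea can be sketched as a small lookup table. The specific sizes and weights below are my reading of the spec preview and should be treated as assumptions; check the published guidelines for the canonical values.

```python
# A sketch of a material-style typographic scale: one typeface,
# a handful of sizes and weights. The exact values are assumptions
# taken from the spec preview, not an authoritative list.
TYPE_SCALE = {
    # style name: (size in sp, weight)
    "display-1": (34, "regular"),
    "headline":  (24, "regular"),
    "title":     (20, "medium"),
    "subhead":   (16, "regular"),
    "body":      (14, "regular"),
    "caption":   (12, "regular"),
}

def style_for(role):
    """Look up the size/weight pair for a named text role."""
    return TYPE_SCALE[role]
```

A single typeface stepped through a scale like this is usually enough to express a full hierarchy without ever switching fonts.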
We also developed a complete color palette
with intentional color decisions for shades, tints, and accents.
These are not just adding white and black to a color,
or using alpha.
We actually looked at each of these shades
and decided what they should be.
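To see why "not just adding white and black" matters, compare a naive linear tint against a hand-tuned swatch. The two hex values below are the indigo 500 and 100 swatches as I recall them from the material palette; treat them as illustrative assumptions.

```python
def mix(c1, c2, t):
    """Linearly interpolate two RGB triples; the naive way to make a tint."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c1, c2))

INDIGO_500 = (0x3F, 0x51, 0xB5)  # assumed base swatch
INDIGO_100 = (0xC5, 0xCA, 0xE9)  # assumed hand-tuned light swatch

# A naive 60% tint toward white does not land on the curated swatch:
naive_tint = mix(INDIGO_500, (255, 255, 255), 0.6)
```

However you pick the mix factor, the machine-made tint drifts in hue and saturation from the hand-chosen one, which is exactly the gap the curated palette closes.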
So strong and intelligent application of color
gives life to your UIs.
And it connects users with your brand.
It also can create very strong hierarchy
and liven up some pretty standard UIs.
As you can see in this example, it's
essentially some kind of form that you're filling out.
And there's a clear area for your title,
and that's what we want people to focus on.
Dynamic color is also really exciting.
Earlier today Matias announced a new API in the L preview
called Palette.
This system makes it really easy to
dynamically select and extract colors for use.
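The real API lives in the Android support library (a Palette is built from a bitmap and hands back swatches such as "vibrant" and "muted"). As a rough conceptual analogue only, much simpler than the real quantizer, a "vibrant" pick can be thought of as the most saturated, reasonably bright color in the image:

```python
import colorsys

def vibrant_swatch(pixels):
    """Toy analogue of a 'vibrant' color pick: score each RGB pixel by
    saturation times value and return the highest-scoring one.
    (The real Palette API quantizes the bitmap first; this sketch does not.)"""
    def score(rgb):
        r, g, b = (c / 255 for c in rgb)
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        return s * v
    return max(pixels, key=score)
```

Feeding in a saturated red, a mid gray, and a near-white pixel, the red wins, which is the kind of color you would then apply to toolbars or accents.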
One of the things that you could use color for
is contrast accent colors.
Contrast colors make this floating action button,
which is the play/pause button, really pop.
Brand, color, and icons are accelerators
that guide users through your design.
When thinking about app icons we looked at real-life studies
in lighting and shadow.
We started by defining a universal grid
for app icons, one that supports primary geometric shapes.
A product icon is the first literal touch
point of your user's experience.
We looked at extracting those attributes from your icon,
and from your brand, and intelligently applying those
to the surfaces within your UI.
The top toolbar and the floating action button,
again, are accelerators for those actions.
Here's just another example of how
to connect those surfaces to the product icon.
And finally, we took the same modern, bold, geometric
approach from app icons and applied it to the UI icons
you see in your app.
Our design teams are now using one common set
of icons between Android L, Polymer, and the web.
This means one trash can across all devices.
And we'll be releasing these icons later this summer
through our design guidelines, available for use
on both Android and the web.
So even if we're doing all this, and we've got great typography
(as an industry we're leveling up when we start using baseline
grids), and we've got amazing color, it's
not enough just to draw the static states and stop there.
We can do more to show people how the states are changing,
letting them focus on the content that matters.
So when you think about it, almost all state changes
in the UI start with user input.
And this is why material UI puts that user input
at the center of the animation story.
So when I touch the screen immediately
the surface responds.
I can see exactly where I touched,
and the intersection of the object that's responding.
This does two things.
First, it tells people that they've been heard.
They feel powerful.
They feel like they're in control.
Apps feel responsive.
Second, it confirms to them that the UI
is going to do the thing that they expected,
that it's working.
This animated touch feedback is now
built into the platforms for both Android and Polymer,
and it's being used by all of the relevant components.
So it's not just the immediate feedback
though that's centered on user input.
The material itself transforms to bring in new content.
And all this happens on a shared stage.
When I say a shared stage, I'm talking about the environment
where this material lives.
It's important to know as we're looking at the material
that it lives in the same scale and in the same space as we do.
We're not flying through some imagined space,
or looking through a window into another world.
This material lives at the same scale as the device itself,
whether it's in our hand, or we're
looking at it on our desk.
We don't move.
The material does, to bring the content to us.
You can see how this works as transitions organize themselves
around the object as it's selected.
The material moves, expands to reveal the new content.
And notice that even as the content transforms in a way
that maybe a physical material like paper
wouldn't, it's still very clear what's
happening because of the way the material responds to light,
and the way the shadows are being rendered by the platform.
So animation is crucial to reinforcing this metaphor
that Matias talked about.
Just as shadow rendering
helps us understand what we're looking at,
the way that things move gives cues about how they work
and about their properties.
So, for example, the material animations naturally
show continuity from one state to another.
When I select an item it expands to the next state.
It doesn't jump cut.
It doesn't blink in and out.
It moves naturally like sliding a piece of paper
across the table.
If we teleport our users from one state to another in a UI
it can be confusing and disorienting
when they get to the other side, almost
like if we were to be teleported in the real world.
If I were to just appear on the stage in front of everybody
here, it'd take me a few moments to get my bearings.
It's the constraints that are inherent in the material that
make it clear for people what can happen
and let them predict and understand what has happened.
So while it makes it easier to understand what's
changing, at the same time it can show us what's important.
So since our eyes are naturally drawn to motion,
if something moves, and it's in our field of view,
we're going to see it.
It's a really strong tool for us to help direct focus.
If in a music screen the player controls
are the primary interaction, the animation can point that out.
There are also the details, those small things
that you might not even notice overtly, like, for example,
the small slide on the control slider as it comes in.
Wait for it.
There it is.
Even though people might not notice it overtly, they see it,
and they know how things work without having
to think a lot about it.
NICHOLAS JITKOFF: So these guys talked about many
of our core principles, primarily
this sense of tangible surfaces, which
appeal to the primal parts of our minds
with dimension and shading.
Bold and intentional design, which provides
a unified language that allows brand and UI
elements to stand out.
And meaningful motion, which fuses
this design and the surfaces together, and gives
new affordances and life to UI.
What we want to use these for is to create
a new unified experience for users.
We're surrounded by devices, and people experience our work
across all of these different platforms.
And for that reason we want to treat every device
as a window on the same content.
We want to tailor the form factor so that each one of them
has commonalities, but also can be unique.
Color, type, and spatial relationships
help tie them together.
In this example email app the sizes
are all related by color and structure,
but there's diversity in the overall presentation.
The larger ones use cards.
The line length is kept reasonable.
And the small ones end up being full
bleed so they can take advantage of the size of the device.
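The decision being described, full bleed on small screens and cards once there's room, is ultimately a width check. The 600dp and 960dp breakpoints below are illustrative assumptions, not values taken from the spec.

```python
# Sketch of picking a presentation per form factor. The 600/960dp
# breakpoints are illustrative assumptions, not values from the spec.
def layout_for(width_dp):
    if width_dp < 600:
        return "full-bleed list"            # phones: use the whole screen
    elif width_dp < 960:
        return "cards"                      # tablets: contain content in cards
    else:
        return "cards + persistent drawer"  # desktop: drawer stays open
```

The point is that the branch itself is trivial; the design work is in deciding what each branch should show.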
In this files app there's a drawer on the side,
but on desktop it becomes a persistent element.
On tablets it's a temporary overlay
so it stays out of the way, and on phones it's
a separate view that you drill into.
In this calendar example there's more variety between the views,
but, again, typography and color tie them together.
So they feel like they're a consistent experience.
Immersive imagery also plays a pretty big role.
This is something we've seen on mobile where people are doing
things full bleed that we actually
want to take back to desktop.
It looks great there as well.
And in particular, when things used to be sort of full bleed,
we now can use things like cards to keep that same sense,
even though they're now surrounded
by other kinds of content.
Beyond the platforms we also care
about working with different disciplines.
Interaction, motion, visual design, and engineering
are often not deeply associated.
And so we've been using this material metaphor
as a way to bind the different disciplines together and make
them more collaborative.
In interaction the materials reinforce the overall sense
of hierarchy.
The scrolling and layering give a good sense
of how gestures work, and emphasize
how the user should focus their attention.
Visual design becomes simpler because of this.
The content itself can be very graphic in its hierarchy,
and rely on dimensionality for higher level
structures like toolbars or other elements
so they're not considered together.
And motion is in most ways the most important.
Materiality provides the grounding for it.
It makes it consistent and intuitive.
So it's obeying realistic physics,
and speaking better to the user for that.
More importantly, it allows motion
to be deeply tied into interaction and visual design.
We've got sessions tomorrow.
And we'll talk a lot more and more
about the interaction between these different elements,
starting in the morning with interaction design,
and then in the afternoon visual and motion design.
If you're interested in learning more now,
you can take a look at the initial spec preview
that we've put up.
There's probably more than you're
interested in seeing in the moment.
But you should come by and listen to the talks.
And we'll point out the most important parts.
And stay in touch with us.
We put up a new site for google.com/design,
as well as a Plus community.
So be sure and follow us there.
We created these principles as a tool for all your future work.
We want to inspire and evolve your designs and your apps.
So in addition to these sessions we'll
have a number of design sandbox events.
You should come by and talk to us.
But thank you for joining us.
JON WILEY: So we've left time for questions.
And there's actually microphones in the center aisle here.
There's two.
There's one in the back, and there's one up here.
So if you all have any questions about this
or design at Google, or--
Nobody has any questions.
Everyone just wants to play with Cardboard.
By the way, if you've tried Google Earth on the Cardboard
thing, it's just amazing.
Here's a question.
AUDIENCE: I apologize.
I'm a developer, not a designer.
So this is a silly question.
I like design though.
I see a lot of circles.
What's up with the circles?
I like them.
I like them.
Just could you speak about where they're appropriate,
what you see them conveying to the users, stuff like that?
MATIAS DUARTE: Who's going to take the circle question?
JON WILEY: Matias takes the circle question.
I'll take the circle question.
I really would have thought the circle question should
have belonged to the art directors.
Well, there's a couple different ways that we're using circles.
Actually, it's probably good to step back and talk about one
of the ways that we've simplified.
We've tried to keep what we're doing in material design
very low level, elemental, and primal:
everything is really its most simple and basic
geometric shape.
So you'll see circles.
You'll see squares.
You'll see rectangles.
You'll see basically divisions of space.
So when you want to have a contained element the simplest
way to do that is to bring in the circle.
So we used a circle because it naturally has
contrast with a space that you've divided up
and that has blocks of text, or areas
that have been created by cards, or divisions of color.
The circle is a great way to draw your eye without motion
to those elements that you want to emphasize,
whether that is the floating action buttons that
are indicating primary actions, or it's
the avatars of people that are very important.
Circles create rhythms themselves
that help you organize and scan through the page,
like when we have the multiple messages in email.
So you should think of the circle as
a geometric element that is a visual design
tool like any of the other tools.
It perhaps, in a very simplified shape palette, stands out.
And that is its attribute, that it does stand out.
And you want to use it in places where you want something to stand out
or you want to create rhythms by repeating it.
Did you guys want to add anything to that?
JON WILEY: Yeah, I was thinking about sort
of where a lot of the circles kind of came
from, and in the early days when we were thinking about this.
So at the beginning I talked a little bit about, you know,
the material principles and the underlying physics.
And one of the things that happens with any interface
if you're interacting with it is you're injecting energy.
There's events that are happening where you're
interacting with it, you're touching the interface,
and as you're using it you're injecting energy
into the process, into the things that are happening.
And one of the most fundamental things
that happens within physics is that whenever a signal happens,
whether it's sound, or light, or what have you,
it propagates in a circle.
It propagates from its point of origin,
ideally at the same velocity outwards, right?
And that's generally a sphere; in a constrained-depth
environment, that's going to be a circle.
And so when you see not only the circles
in terms of the affordance for interacting,
but also as you tap on things and that circle radiates,
it's really about conveying the sense of the physicality
of the energy going into this system,
and that your actions are actually
spreading forth, communicating
with the rest of the things that are on the screen.
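That constant-velocity picture can be made concrete: the ripple's radius grows linearly with time, and it has finished once it reaches the farthest corner of the view. This is a sketch of the geometry, not of how the platform actually implements its ripple.

```python
import math

def ripple_radius(t, velocity):
    """Radius of the touch ripple after t seconds at constant velocity."""
    return velocity * t

def ripple_duration(touch, view_size, velocity):
    """Time for the ripple to cover the view: distance from the touch
    point to the farthest corner, divided by the propagation velocity."""
    w, h = view_size
    x, y = touch
    farthest = max(
        math.hypot(cx - x, cy - y)
        for cx in (0, w) for cy in (0, h)  # the four corners
    )
    return farthest / velocity
```

Because the duration depends on where you touch, a tap near the center finishes sooner than a tap in a corner, which is part of what makes the feedback feel physical.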
Next question.
AUDIENCE: My question is regarding the form factor.
It seems like with material you'll
be able to design one app, and basically the app will adapt,
it'll be responsive to the screen size
that the app is running on?
Is this where this is going, where with the material
you'll be able to create one code base
and be able to run the app on any screen size?
MATIAS DUARTE: Want to talk to that one Nicholas, or should I?
NICHOLAS JITKOFF: There's two sides of this.
One is we want to make it as easy as possible to not just
re-flow to different sizes, but also to tailor the way that we
re-flow in a unique way.
So while the default platform behavior
will do the right thing as far as allowing
you to expand things to different sizes,
we do want more thought and attention
placed on how it should actually accomplish that.
Beyond that, a lot of these design guidelines
are intended to make that much more seamless of a transition.
The commonality of iconography and typography
helps blend them together, even as we
introduce more differences.
But our prime focus right now is trying
to get these things to carry a design
across those different form factors.
MATIAS DUARTE: Yeah, I mean, I'll
be a little more explicit as well.
I mean, for starters, we do have two code bases,
one if you're developing native apps for Android,
and one if you're developing with Polymer for the web.
There are framework components in both apps which
will make it easier for you to scale between different screen
sizes, but there is a lot of intentionality and design
thought that needs to go into making some of those decisions.
So we don't have one automatic system
that will take any application and automatically scale it
for you to any screen size.
Maybe someday.
But I think that would actually require intelligence that's
pretty close to that of a developer and a designer.
What we do have here is a design system
where you can create a coherent app and use the same design,
and have very straightforward ways to adapt it
if you understand also what the purpose of the app is,
and what is appropriate for the different screen sizes.
And that still requires a human.
AUDIENCE: First off, you guys rock.
JON WILEY: Thank you.
AUDIENCE: So I'm a big fan of the animations,
and giving cues to the user and everything,
but sometimes it can be too much, right?
So at what point do you say, if I
hit the play button it's going to go around the screen three
times, come back in, and pop up--
JON WILEY: That happens on every press every single time.
AUDIENCE: So sometimes users can see that as a bug,
or it might slow them down to whatever actions they're doing.
So were there any concerns around that?
And if so, how do you tackle that challenge?
CHRISTIAN ROBERTSON: Yeah, so I think one of the things
to know about-- This isn't just true of motion design.
It's true of design generally, that there's
a lot that's going on below what people notice.
So we don't actually want people all the time
thinking about the animations when
they're going from one state to another.
We don't want them to say, gee whiz,
I just opened up that email, and again I saw that animation.
We want them thinking about the email.
So if you apply that standard of how much do
I want people thinking about the animation itself,
and what is it trying to communicate, then
you can kind of back it down to the right level.
NICHOLAS JITKOFF: One of the things
that we considered consistently throughout this process
was how to use the animation to sort of go along
with the user's intention.
Because when you have touch responses that sort of emanate
outwards it's very easy to hide details in there that reinforce
what's happening without feeling like you've
added a bunch of animation in there.
So we're going to go into this a bit more tomorrow,
but having things move counter
to the direction the user's going
draws attention to them.
And if you're going to do so, you
should do that intentionally.
If you don't want attention drawn to things,
there's places to hide them.
And it's really about trying to figure out
how you want to balance it, and where
you want to draw people's attention.
AUDIENCE: So I was wondering if you could expand
on how you see new forms of input with respect to motion.
So in regards to wearable devices,
let's say a swipe left and a swipe right with a wrist watch,
or drawing a circle in the air, how
would that integrate with user experience on an app level
let's say, or on your desktop?
And how that would sort of integrate?
JON WILEY: We've been given a design challenge on the fly.
How to design gestural interfaces.
I think a lot of this goes back to the material as a metaphor.
And part of what we're trying to do is-- You know,
you go and you watch some summer blockbuster sci-fi movie.
And sometimes it can be a little bit of a let down
because like they set up the world in which they're
operating, but then they break some principle, right?
And it just kind of falls flat.
And why did that happen?
And I think part of what we're trying to do with a system
like this is to set up a series of sort of a world in which it
functions, and ideally it's grounded enough in the reality
that we have here such that you can bring your intuition of how
objects function to the system, and it
fulfills those expectations, and hopefully exceeds them,
but then maybe we add in additional magical things
we can't quite do yet with physical devices
because we're rendering with virtual devices.
The bridging of gesture, and other types of interfaces to me
is actually just another level, it's
another additional dimensionality
in terms of interaction.
And it's a progression.
So when we started with computers
we basically had a keyboard.
And then eventually we got a mouse.
And so that was like a little bit
of interaction, slightly removed.
You know, here it's happening up here,
and I'm using these things to control it.
Then we get all the way now to today
where we have smartphones with touch screen displays.
And now we have this additional dimensionality
of being able to physically touch the software.
And now we're adding this other layer in, hopefully over time,
and hopefully do it right, where we
have wearable devices that have gyros and stuff.
I mean, even the cardboard device actually
is really fascinating because-- Again, it's really awesome.
You play with it.
But it's just like doing, you know,
basic head tracking and things like that.
And so as we continue to sort of add
in the dimensionality of interaction with gestures
it just makes it so much more rich.
And so we just want to make sure that we're always grounding it
into sort of these principles that we set up initially.
So it's always self consistent.
You don't get to a moment where you're like, oh,
I did this thing, and that didn't make sense.
It was discontinuous with all the principles
that had been set up before it.
One of the things that I think is really interesting about this,
and we'll talk about this more in tomorrow's sessions,
is that there's actually a fair amount of spectrum.
There's a lot of different types of things
that can be expressed, both in terms of color, and content,
and the animation, which gives you a pretty big palette.
It still feels like it's part of a system,
but it gives you a very large palette
so you can express lots of different things.
So I think with the addition of wearables
I look forward to seeing how people express their apps
and their applications through these additional different
interface forms in terms of manipulation.
NICHOLAS JITKOFF: One of the other things
that came up as we were doing this,
first as we were looking at just desktop and mobile together,
was unifying the overall touch feedback through them.
So having hover states and tap states all
be resolved together.
And then we started to see more and more of these.
And treating it as energy was a very nice metaphor for us,
because we were able to look at on touch
these things are going to move outwards.
Voice, as you're speaking, you have a similar response to it.
As you use the D Pad on the TV the movement
of your focus through a UI will be very similar.
And one of the things we didn't really
touch on, but like tabbing between controls using
a keyboard can have a very similar feel.
It is your focus moving through space,
and then when you hit Enter on something,
that energy blast is very similar to any other mode
of input you might do.
AUDIENCE: Thank you guys.
Appreciate your time.
So I work for a company where we are both cross platform
and our mobile apps are not our primary money maker.
So when it comes to requesting time be allocated
to working on this it can be looked
as a little bit of indulgence.
What would you suggest as some maybe priorities
to focus on when looking at implementing
some of these things?
NICHOLAS JITKOFF: For the most part
the framework should be supplying 90% of the animation
and the richness in the UI.
What we're asking people to do is actually
look at some of the new APIs for shared element
transitions between activities.
Like there are moments that make good places for polish,
and brand in your app.
So using the core elements, using the core iconography,
using the theming to make it feel like it's your product,
but then just polishing one or two
really critical things usually has the best outcome.
JON WILEY: Yeah, and the Polymer framework is actually
going to be really great for all kinds of different platforms,
you know, mobile platforms.
NICHOLAS JITKOFF: Yeah, and part of the reason
that we've done this for ourselves, it's helpful for us
to be able to have, if we do a mobile web
app and a native app, to create a very similar structure there,
so we don't have different interaction designers thinking
about different ways of handling it.
They should be treated as pretty much the same app
with small adjustments.
JON WILEY: This is probably a good opportunity
to also mention that after this some of us
are going to be in the design sandbox on level two.
And as part of releasing our spec, and design,
and having all this announced, we really
want to talk to designers and developers
about kind of the problems they have in terms of what they're
trying to solve for their users, and so that we can understand
how material design and the principles
here can help support that, or maybe there's
some unique problems out there that
need a little bit of a different twist.
And we'd love to hear feedback about that.
AUDIENCE: Hey there.
I'm Pierre.
I'm a UX/UI designer.
And my question is regarding animations.
For one, as you all know, animations
are something that kind of make an app pop and stand out.
And recently we had a lot of trouble.
I work in design agencies and stuff.
So where we make an app, we create animations
either in After Effects, Origami,
or we just animate them manually.
And I was wondering if there's going
to be a tool from Google that will help us kind of animate
our design, because that's the hardest part of transferring it
to a developer, explaining to them
how it's going to interact.
And will there be an addition to the design guidelines
explaining what kind of default animations you should focus on?
Will there be like more of a sandbox the designers can
use so they can kind of carry their points
across to the developers?
Because I usually find that point where
you kind of transfer all of your work to the developers,
that's the toughest bit.
And if there were a tool, like a native tool,
that would help aid in that process, that would be great.
But do you have any tips for that?
And is anything planned to improve that?
CHRISTIAN ROBERTSON: So even before the tool,
one of the things that we've noticed,
as we've been doing that same process of designing
things and working with engineering to make them happen,
is that having that shared metaphor,
and having engineering understand
those same principles, gets us a lot closer
from the very beginning: we already know how things should
move, and how things should react.
And I think that's a good place to start.
I don't know if anybody else wants to comment.
NICHOLAS JITKOF: It's also helped us that we actually
define a number of different animation curves.
We're not using just simple ease-in and ease-out.
We got those built in, and worked
with the engineers so they understood them.
And we tend to use them everywhere,
and we iterate on them as needed.
So tools aside, we found it just helpful
to stay in constant communication
with the engineers, and try to get as much of this stuff baked
in as logic into the system rather than like as things
we're handing over that are one-offs.
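The custom curves described here are typically expressed as cubic Bézier easing functions; the material motion guidance's standard "fast out, slow in" curve corresponds to cubic-bezier(0.4, 0.0, 0.2, 1.0). A minimal Python sketch of evaluating such a curve (the function names are illustrative, not from any Google API):

```python
def cubic_bezier_ease(x1, y1, x2, y2):
    """Build an easing function from a CSS-style cubic Bezier curve.

    The curve runs from (0, 0) to (1, 1) with control points (x1, y1)
    and (x2, y2); the returned function maps an elapsed-time fraction
    to an animation-progress fraction.
    """
    def bezier(t, a, b):
        # One coordinate of the cubic Bezier at parameter t.
        return 3 * t * (1 - t) ** 2 * a + 3 * t ** 2 * (1 - t) * b + t ** 3

    def ease(x):
        # x(t) is monotonic for valid control points, so invert it by
        # bisection, then evaluate y at the recovered parameter.
        lo, hi = 0.0, 1.0
        for _ in range(50):
            mid = (lo + hi) / 2.0
            if bezier(mid, x1, x2) < x:
                lo = mid
            else:
                hi = mid
        return bezier((lo + hi) / 2.0, y1, y2)

    return ease

# The "fast out, slow in" curve: quick start, gentle settle.
fast_out_slow_in = cubic_bezier_ease(0.4, 0.0, 0.2, 1.0)
```

Defining a small named set of curves like this once, and sharing it between design and engineering, is the "built in" approach described above.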
AUDIENCE: And what's your standard design process
for explaining it to the developers?
Is it just a case of them sitting near you
while you cover the entire animation process with them?
How do you get the point across to them?
NICHOLAS JITKOF: It's fairly ad hoc.
Depends on the situation.
Sitting together is always the best way
to accomplish anything.
AUDIENCE: I was just wondering if you had a standardized process
for that.
All right.
MATIAS DUARTE: I do want to mention before you walk away
that that kind of request, or interest in tools or anything
like that: we don't have any tools to announce today,
but that's the kind of feedback and pain point
that you guys are feeling that we'd
love to hear more about, so we know how to focus our energy.
And if you have more of those kinds of questions
or requests, come by the design sandbox.
JON WILEY: One follow up.
AUDIENCE: Where's the design sandbox?
JON WILEY: It's the second floor.
It's over by the Distribute section.
MATIAS DUARTE: It's kind of like you look for the YouTube sign
and follow the YouTube sign.
It's by the YouTube booth.
It's a very clear design.
AUDIENCE: So my question is really about color.
I see a lot of bright colors these days, everywhere.
Even in the slides here, some of you
have a purple background and a red one.
So, very bright colors, in all different combinations.
How do you pick these bright color combinations
in the context of material design so that they're actually
appealing to the user, so that they have meaning and aren't
so bright as to be noisy?
JON WILEY: We test 41 shades.
I'm just kidding.
Too soon.
Too soon.
JONATHAN LEE: There are two approaches to color.
One is that we intentionally wanted
to take a very exuberant approach to applying color
to our applications.
We felt, as I was talking about in some
of the slides, that you should really embrace your brand's
product colors, or your icon colors, and extend those
all the way through the whole series of screens
that someone's going to see.
And so we thought, why not push it further?
And on the reverse side, I think there are also
some pretty extensive usability and contrast-ratio studies
that we started to standardize around.
And I think the intention
of the design guidelines is to help
give guidance on being exuberant while still
being accessible and usable.
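The contrast studies mentioned here are usually formalized as WCAG relative-luminance contrast ratios (black on white is 21:1, and WCAG 2.0 asks for at least 4.5:1 for body text). A minimal sketch, assuming sRGB hex colors; the formula and thresholds come from WCAG, not from the material spec itself:

```python
def relative_luminance(hex_color):
    """WCAG 2.0 relative luminance of an sRGB hex color like '#6200EE'."""
    r, g, b = (int(hex_color.lstrip('#')[i:i + 2], 16) / 255.0
               for i in (0, 2, 4))

    def linearize(c):
        # Undo the sRGB gamma encoding per the WCAG 2.0 definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = linearize(r), linearize(g), linearize(b)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# WCAG AA asks for at least 4.5:1 for normal body text.
print(round(contrast_ratio('#FFFFFF', '#000000'), 1))  # 21.0
```

Checking candidate text and background pairs against a ratio like this is one concrete way to stay exuberant while remaining accessible.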
MATIAS DUARTE: Yeah, so to be super clear, if you
go into the spec you'll find a whole bunch of palettes,
and they have a lot of guidance on what kinds of colors
you should put together in order to get legible text on top
of them, and stuff like that.
But we are very much in love with,
and excited by, these modern, bold colors.
But the entire system is designed
to let different approaches and philosophies
to color feel just as much at home.
If you want your content to pop more and your UI elements
to be really muted and subdued, you know,
that's part of material design as well.
The idea is that it is a material like paper,
and you can print something that's very exuberant on it,
or you can print something that's much more muted.
We want to be a framework where any brand can fully
express itself and feel at home, and not
be overshadowed by the platform's conventions.
NICHOLAS JITKOF: Some of the examples you can actually
look at in the preview: the Settings app is quite muted.
It's intended to be very utilitarian.
The Calculator app is also muted, but it
uses a single pop of color to draw attention
to other functionality.
So it's, again, about managing where
you want attention, and it's that use of color that
makes an app really unique.
One other point: the palette
library that we've been working on.
One of the reasons we were so excited about it is that it
lets you select colors to sit alongside imagery, so that
the image feels like it is covering the entire surface,
rather than an arbitrarily chosen color that
may contrast in some unusual way.
So you have the opportunity to have it extend the image
outwards into the system, or even to contrast to that image
if you want to draw attention to an action.
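The Palette library discussed here is an Android support-library API; conceptually, it quantizes a bitmap's pixels and promotes the most prominent bins to swatches. A toy Python sketch of that idea (not the actual implementation, which uses a more sophisticated color-cut quantizer):

```python
from collections import Counter

def dominant_color(pixels, bucket=32):
    """Return an approximate dominant color from a list of (r, g, b) pixels.

    Quantizes each channel into bucket-sized bins so that near-identical
    shades count as one color, then returns the center of the most
    common bin. A toy version of what a palette extractor does.
    """
    def quantize(p):
        # Snap each channel to the center of its bin.
        return tuple((c // bucket) * bucket + bucket // 2 for c in p)

    counts = Counter(quantize(p) for p in pixels)
    return counts.most_common(1)[0][0]

# A mostly-teal image with a few red pixels yields a teal-ish swatch.
pixels = [(0, 150, 136)] * 90 + [(255, 0, 0)] * 10
print(dominant_color(pixels))  # (16, 144, 144)
```

The real library also extracts vibrant and muted variants; the sketch only shows why quantizing first makes the "dominant" choice stable against noise in the image.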
MATIAS DUARTE: And it looks like we only
have three minutes and four questions left.
So let's try to get through them super quick.
AUDIENCE: I'm sorry to take the time,
but actually I have two questions, and it's tough.
The crowd will vote on which one we answer.
I'm just joking.
Go ahead.
AUDIENCE: First of all, I heard Jonathan is working on UXA,
which is the Google design guidelines for the Google apps.
Would you like to talk about that a little bit:
how you're working across different apps
to make the UX guidelines consistent across Google apps,
and how you make sure all the apps
follow the guidelines?
That's the first question.
And the second question: Google has always
had a big push on left navigation.
As a company we didn't follow that rule,
and unfortunately, right after our release
moved from main-screen navigation to left navigation,
we saw a big drop in user engagement.
We also see a big trend there: the Facebook app
and Google+ both changed from left navigation to main-screen
navigation.
So would you like to share some insights on that change,
so that the many companies developing
social apps could learn from it?
Thank you.
NICHOLAS JITKOF: OK, I'll answer the second question first.
MATIAS DUARTE: Super quick second question.
NICHOLAS JITKOF: Left nav is appropriate for certain classes
of apps.
The place we've seen it most successful
is something like Gmail,
or a stream of content, where 90% of your time
is spent in one place, and the rest of the time you sometimes
need access to other labels, or things like that.
We'll go a lot more into this in interaction design tomorrow
morning, but left nav is only one
of a number of different top level constructions.
You can have tabs.
You can have the content area itself
act as overall navigation.
One of the things we are trying to do in the guidelines
is to very specifically call out the benefits
of each of those different options.
Left nav itself isn't inherently bad,
but when it's used for the wrong type of application
it can focus the user in the wrong way.
JON WILEY: And I'll answer the question
of how we coordinate design across Google.
We all talk to each other all the time.
That's it.
We go up.
That's it.
MATIAS DUARTE: Lots of drinking.
JON WILEY: Just talking all the time.
Just talking.
All right, next question.
AUDIENCE: I was thinking about all the [INAUDIBLE]
and the physicality that you stressed
in the paper metaphor.
And I was thinking about Facebook's Paper app.
They go a step further, doing away with all buttons
and making motion the cue to the interaction,
like to close something.
And I wanted to know what you think of this.
If you think that this app would feel right
in the next generation of Android apps?
NICHOLAS JITKOF: We love using motion
as an affordance for things that can happen.
We need to be careful about being too reliant on it,
because there may be people who can't perceive it,
but it is a wonderful way to give cues
as to the gestures that can be taken,
and even just to simplify the way the UI looks.
So we're excited about the possibilities.
MATIAS DUARTE: And you can see in our Calculator
app we have a side panel of expanded options.
There's no button for it.
There's no little textured drag handle.
It's just a surface.
AUDIENCE: And you close it by sliding it?
JON WILEY: All right.
We can take one more question.
Come on.
JON WILEY: Just one.
They're telling us to go.
AUDIENCE: Hi, I was wondering if these design principles also
apply to Google Glass?
JON WILEY: Will the design principles also
apply to Google Glass?
NICHOLAS JITKOF: As of right now, our design principles
don't cover Google Glass.
The guidance on color, brand, and iconography
definitely does apply,
and we're working very closely with that team.
Our primary focus for now has been sort of the watch
to television form factors, but we're definitely
considering it and working to tie them together.
JON WILEY: Awesome.
Thank you everybody for attending.
Enjoy the rest of your afternoon.
Google I/O 2014 - Material design principles
