Everyone who has ever worked on MacOS has felt this pain at least once.
You select a couple of files, right-click to open the menu, and click Get Info…
The Up Up Down Down podcast has a good episode with Asher Vollmer.
Asher Vollmer, creator of Threes and Close Castles, joins us to talk puzzle games. We discuss the process of designing and developing Threes, and what goes into making a great little puzzle game. We then talk about the whimsical minimalism that works so well in indie games, and how to bring them to life. We talk about difficulty, complexity, and how to tune games. We touch on the effects of free to play on puzzle games, and how it makes us feel weird.
There are some great points in there. It focuses a lot on Threes' gameplay and evolution, and dives a bit more into minimalism in games.
I'm self-employed and I work every day at home for my company, Pixelnest Studio.
A common good practice, recommended by almost every home-worker, is to take a quick (3-minute) walking break every 45 minutes or so.
- It makes you move your body. Especially your legs and your back.
- It gets you away from your computer.
- It rests your eyes. Try to focus on an object about 15 meters away and blink a few times. It's a good way to avoid using only your near vision while you work.
It's a simple habit, but it's hard to get used to. For the last few weeks, I've forced myself to walk for 3 minutes after every hour of work.
And… it's great. I really feel much more focused and ready.
Because I forget to stand up as soon as I start working, I've made a little "app" for OSX. In fact, it's just a simple Automator app that triggers a notification reminding me to walk for three minutes. Follow the gist above to learn how to use it. It's dead simple, but it works.
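If you'd rather script it than use Automator, the same idea fits in a few lines of Python. This is a minimal sketch of assumed behavior, not the actual gist: the title, message, and hourly interval are my own choices, and the notification goes through macOS's `osascript` when available.

```python
import shutil
import subprocess
import time

TITLE = "Stand up"

def applescript_for(message: str, title: str = TITLE) -> str:
    """Build the AppleScript snippet that posts a notification."""
    return f'display notification "{message}" with title "{title}"'

def notify(message: str) -> None:
    """Post the notification on macOS; fall back to printing elsewhere."""
    if shutil.which("osascript"):
        subprocess.run(["osascript", "-e", applescript_for(message)], check=False)
    else:
        print(f"{TITLE}: {message}")

def remind_forever(interval_seconds: int = 60 * 60) -> None:
    """Nag once per interval; run this in the background while working."""
    while True:
        time.sleep(interval_seconds)
        notify("Walk for 3 minutes")
```

Calling `remind_forever()` from a terminal tab is enough; there's no need for a full app to build the habit.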
NB: this advice is especially true for self-employed people (because you can't take a break with your coworkers), but anybody working at a desk for several hours should do it. It costs nothing, and your body will love it.
I've finally watched the Material Design segment of the Google I/O 2014 keynote.
Material Design is Google's new design language. It looks great and promising. There are some nice ideas, akin to iOS 7, like depth (handled automatically for developers via a z-axis with live shadow rendering) or transitions and animations as first-class citizens.
Bonus? A comprehensive set of guidelines with a lot of examples.
Unfortunately, there are some bullshit lines lurking inside the documentation…
Create a visual language that synthesizes classic principles of good design with the innovation and possibility of technology and science.
A material metaphor is the unifying theory of a rationalized space and a system of motion. The material is grounded in tactile reality, inspired by the study of paper and ink, yet technologically advanced and open to imagination and magic.
Please, oh please, don't let marketing people write technical papers.
I'm really eager to see how it will be implemented in Android and the Google web apps (if I understood correctly, the current Android L preview is not completely converted to Material Design yet).
There are still a lot of things that I don't like about Android, and a lot of rough edges (the awful black navigation bar, the heavy use of small typography, weird transitions, and so on), but it's a great step forward for the platform. And there's even a decent status bar now!
The real challenge for Google is getting developers to adopt the new style. It's the fourth Android redesign (if I count correctly), and the majority of Android apps still look bad. I hope that this time, developers will rejoice and follow Google's lead.
One of the features that I wanted the most in iOS 8 was simple: Add a "Use Last Photo Taken" button in the Messages app (along with the current "Take Photo or Video" and "Choose Existing").
The idea originally came from Neven Mrgan and was later adopted by a lot of apps (Tweetbot, Slack, Droplr, etc.).
An example with Tweetbot 3:
But it was still missing in the one place where it would definitely shine: Messages.
The way it was added is brilliant. It's so simple that I don't know why it hasn't been done before (it may have been; I just don't know of an app that has done it).
Here it is:
It's not a button. It's a grid of three (or more) images.
It solves three problems:
- You can use the last photo. 80% of the time, that's what you (and I) want to do.
- You can see the last photo and be sure of what you are doing BEFORE doing it. With a button, the app had to provide a way to check the action. In Tweetbot, for example, the compose view shows a small thumbnail of the image. In Droplr, it's too late: as soon as you tap the button, the photo is uploaded, and you have to delete it to fix your error.
- You can use the last few photos too, which might cover 80% of the remaining use cases.
It's so simple and clever that once you have seen it, you can only think: "Of course."
The game is short, reasonably priced ($4), and stunning. The sounds are just perfect.
I'm pretty sure this game is going to popularize low-poly graphics in indie games.
Originally, Quartz Composer is a “visual programming” environment for creating animated scenes.
But recently, some designers have started to use it to mock up animations and interactions.
Because the primary goal of Quartz Composer is to create motion graphics, it makes sense to use it for interaction prototyping. The problem is that there aren't many patches (a patch is like an instance of a class in a classic object-oriented programming language) dedicated to user interfaces in Quartz Composer.
That's where Origami steps in:
Origami provides a set of tools for Quartz Composer that make interaction prototyping a lot easier.
Animations, transitions, buttons, layers, phones, etc.
I recommend watching this video to get a brief overview of what is possible with Origami. It's intriguing, it looks promising, and it motivates me to learn Quartz Composer more seriously.
Alexis Madrigal on Netflix classification (emphasis mine):
[…], we discovered that Netflix possesses not several hundred genres, or even several thousand, but 76,897 unique ways to describe types of movies.
Using large teams of people specially trained to watch movies, Netflix deconstructed Hollywood. They paid people to watch films and tag them with all kinds of metadata. This process is so sophisticated and precise that taggers receive a 36-page training document that teaches them how to rate movies on their sexually suggestive content, goriness, romance levels, and even narrative elements like plot conclusiveness.
The Netflix Quantum Theory doc spelled out ways of tagging movie endings, the "social acceptability" of lead characters, and dozens of other facets of a movie. Many values are "scalar," that is to say, they go from 1 to 5. So, every movie gets a romance rating, not just the ones labeled "romantic" in the personalized genres. Every movie's ending is rated from happy to sad, passing through ambiguous. Every plot is tagged. Lead characters' jobs are tagged. Movie locations are tagged. Everything. Everyone.
That's the data at the base of the pyramid. It is the basis for creating all the altgenres that I scraped. Netflix's engineers took the microtags and created a syntax for the genres, much of which we were able to reproduce in our generator.
This article completely blew my mind. Great journalism.
The idea of tagging every movie in such detail and creating ultra-specific subgenres is so clever. A hell of a lot of work, too.
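The combinatorics are what make this work. As a toy illustration of the generator idea described in the article, here's how a handful of microtags can explode into many genre names; the tag vocabulary and the "Adjective Genre Qualifier" grammar below are invented for illustration, and Netflix's real syntax is far richer.

```python
from itertools import product

# A tiny, made-up microtag vocabulary (Netflix's is vastly larger).
ADJECTIVES = ["Critically-acclaimed", "Understated", "Visually-striking"]
GENRES = ["Mysteries", "Dramas", "Thrillers"]
QUALIFIERS = ["with a Strong Female Lead", "from the 1970s"]

def altgenres(adjectives, genres, qualifiers):
    """Yield every 'Adjective Genre Qualifier' combination."""
    for adj, genre, qual in product(adjectives, genres, qualifiers):
        yield f"{adj} {genre} {qual}"

names = list(altgenres(ADJECTIVES, GENRES, QUALIFIERS))
# 3 * 3 * 2 = 18 genre names from just eight tags. Scale the vocabulary
# up and tens of thousands of altgenres fall out of a modest tag set.
print(len(names))  # → 18
```

Multiplying a few tag axes together is exactly why a 36-page tagging manual can end up producing 76,897 "unique ways to describe types of movies".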
Concerning the Perry Mason mystery, I found an interesting explanation in the comments (yep):
Actually, Alexis, I am baffled by your conclusion that the Perry Mason Mystery is a "ghost in the machine".
To me it seems obvious: there is a subgenre, "Understated Cerebral Mysteries with Ironclad Plots, Good Dialogue, Not Much Action or Romance, and on the Side of the Defense", that is popular with viewers yet drastically underpopulated. It's so drastically underpopulated that the show that's a best-fit for the category is enormously popular, much more popular than anyone realized.
Far from being a "bug", this is programming platinum for Netflix. If they're as smart as I think they are, this is the subgenre where they should be looking to make a TV series. The biggest problem will be finding a showrunner and scriptwriters who are able to go against so many of Hollywood's cliches and assumptions. They need to make something where what is visually interesting, striking, or exciting is unimportant, but where there are no holes in the plots. Very high degree of difficulty, and only profitable to Netflix, which makes money from its shows, not from the advertising.
Sounds a bit like Sherlock, no?
Over the last few weeks, we (Damien and I) have put together a complete tutorial about Unity and its new 2D tools. The tutorial covers the creation of a very small game from start to finish.
We hope you will enjoy it.
It looks (and sounds) gorgeous. And it's made by Capy, which is arguably one of the best video game studios out there.
Better late than never; I've managed to watch How Designers Destroyed The World by Mike Monteiro.
It's a great talk except that I'm not fond of the tone. Though it fits well with the content, I find it a bit overdramatic. The end is also a tad lengthy, which is unfortunate.
Otherwise, I liked what I heard. Which makes me consider…
Regularly, I find myself stuck with incredibly ill-conceived things that frustrate me. These objects, rules, or designs can be as simple as an ATM, a piece of software, or a door. They make us lose time or, worse, make (serious) mistakes.
For example: there is a crossroads near my parents' house. A few years ago, the mayor decided to remove the road signs at this spot AND not to replace them. In France, there is an implicit rule which states that when there are no signs, you have to yield to cars coming from your right at the junction. This rule is stupid. When you drive, you have no time for doubt. A sign serves primarily as a way to convey a danger or a rule without making you think. You know instantly what you have to do. If you see a stop sign, you stop. That is all.
This implicit rule makes us do the opposite. At a junction like the one I mentioned, you see nothing, so you have to think, analyze, and react. It's easy to have a (minor) car accident there. Moreover, the people of my town had used this junction for years, knowing precisely how to react. But now the priority rule has changed, and it's a mess. The result? In the best case, everybody stops, waits 10 seconds to analyze the place, and then drives.
This is the result of a design decision. Someone decided to do that. In this case, someone decided to break something that had worked well all this time.
Sometimes, it might just be a dumb design that puts you in an embarrassing situation. If you are a man, just remember the last time you used a flat, too-high urinal. :)
When I talk about that with some of my friends, they think that I'm excessively demanding and that the problem is me. Perhaps.
I will conclude with a quote from the talk:
There are professions more harmful than (industrial) design, but only a very few of them.
— Victor Papanek (modified by Mike Monteiro)
Jeff Atwood in Good Programmers Get Off Their Butts:
I am not proposing a code-like-hell methodology. I am merely observing that, in my experience, coding without planning is just as futile as coding with too much planning. Software development is a wicked problem; you should never make planning decisions without some kind of code prototype to ensure that you're making informed decisions. If you plan too far ahead of the code, I guarantee you are doing work that will be thrown away or altered until it is unrecognizable.
The most destructive symptom of over-planning is the wrongheaded idea that being a Software Architect(tm) means drawing a lot of UML diagrams and handing them off to a group of developers in Bangalore. UML is great if you don't want to do any work; you just draw pictures of what it would look like if work was actually done. This is not only borderline laziness, it's also a recipe for disaster. You can't architect a real world application on a whiteboard. You must prototype it in code to gather data on the performance and design implications of the choices you are making.
9to5Mac (emphasis mine):
For pre-iPhone 5s devices, ARGUS usually drains about 20-30% of battery life per day while running in the background. The app continually checks for motion and calculates the distance and number of steps using the accelerometer and GPS. With the M7, however, the app does not need to be running in order to keep track of the number of steps – all of that data is tracked by the M7. As such, ARGUS no longer takes up any battery power while running in the background and the stated battery life from Apple – 10 hours 3G talk time, 250 hours of standby – will stay exactly the same. […]
The M7 APIs allow developers to query information about the user’s current transportation status (whether they’re in a car, walking, not moving, etc.).
If this is true, it's impressive.
I hope that the use of the M7 will expand outside of the fitness and health apps.
Florian Kugler about mobile apps in Worth Less than a Cup of Coffee:
[People] might say otherwise when asked about it, but their actions speak pretty clearly: A cup of coffee is worth more than almost every app on the store.
That's a hard pill to swallow, but we should let it sink in. We pour all our creativity, time, and passion into creating basically worthless products.
Recent events and the backlash against Realmac with Clear for iOS 7 are sad.
People don't understand that creating software — truly good software that sweats every detail — takes a lot of time and money. Ironically, even a lot of developers working at big companies don't want to pay for software.
The mobile app markets have made this situation worse.
I'm currently rethinking the mobile market, especially for games. At one point, I thought that going mobile-only was the best shot, but the more I think about it, the more I realize that a platform like Steam (which also drives prices to the bottom, alas) is more sustainable for a developer or a company.
Marco Arment said in his latest blog post: "Paid-up-front iOS apps had a great run, but it’s over. Time to make other plans."
So what does that leave? Freemium? Unfortunately, freemium almost always comes with bad apps or games that trick users into paying for more.
N.B.: To be clear, as a user, I'm happy to pay less. For almost everybody, software or a game at $50-100 is way too expensive, and that's fine. What I don't like is that we are now in a situation where even one buck for an app you use daily is too much. One buck per app cannot sustain a one-person business.
"There's a lot of outrageous stories, but everyone's so damn afraid of coming forward—It's like going against the Mafia," he said. But the idea that trolls may retaliate against those who speak out is overblown, he thinks. "If they want to try to teach me a lesson, go for it. This will be my retirement. I'll fight them."
Here it is:
Everyone knows that patent trolls are useless companies that only try to steal money from defenseless ones.
Everyone knows that they don't invent anything.
Everyone knows that they cost a lot — of time and money — with endless legal procedures.
Everyone knows that it's pure bullshit and extortion, and yet, it is perfectly legal.
N.B.: I'm not against patents and inventors' rights. But there is a big difference between protecting your property from real thieves and suing everyone without actually creating anything.
However, if the only solution to this problem is to abolish the patent system, so be it.
The Most Forward Thinking Apple Yet (emphasis mine):
In many ways, WWDC, and more so, the iPhone 5C and 5S represent (and had to represent) Apple's reply to the naysayers. However, I think there was more to the announcements than what met the eye.
Read the whole thing.
Everything that Apple announced last week has little real purpose currently — or only a fraction of what will be possible one day.
For example, the fingerprint sensor is a really neat idea, but we will be able to do a lot more with it in the future. The uses of the M7 motion co-processor or 64-bit iOS are even less tangible. Unless you are Apple and you know what will be released next.
They are calmly preparing the ground for the shape of things to come. It's thrilling.
N.B.: The Bluetooth LE technology that was embedded a few years ago was exactly that: at that time, it was useless, but in the context of a wearable Apple watch, it makes a lot of sense.
Hell, it's about time.
→ Get here.
This post is not about the technical quality of the logo. I am not writing about brand design, but about brand management. This is about a simple rule: Brand design follows brand management, not the other way around.
One could argue that we can’t say if this is bad brand management. Unless we know what the brand ambition, the brand architecture, and the brand strategy is, we ought to have no opinion. Maybe the logo does exactly what it is supposed to do. Because really, it doesn’t matter whether it looks pretty, or whether someone likes or dislikes the purple or the scallops. The Coca-Cola logo was not designed by a professional designer, it is typographically hideous, but that doesn’t matter. Brand identity is not about visual refinement or aesthetics. It can be purposely ugly, like the London Olympics logo. What is important is that it is done seriously.
Personally, I don't care about the new Yahoo logo. Anything would be better than the old one.
By the way, I'm still amazed by the quality of the iA website, and especially the blog. Typography, details, everything. It's wonderful.
Microsoft did not make its mark as a builder of great things, but as a very successful bundler of good-enough things.
Eventually by the mid-90s an argument could be made that Microsoft was making the best PC software in its class. Not because it had suddenly found the ability to develop cool, innovative products, but because everyone else was dead.
Still, outside that monopoly, there is a clear distinction between Microsoft products that can trace their market dominance back to DOS, and those that can't. The former make money, the latter lose it.
I jumped on the leaked prototype, with the same general sentiment: Samsung’s watch will undoubtedly change drastically whenever Apple’s wearable is released. And, to be certain, we will mock them for copying.
We, especially in the West, have a powerful sense of justice and fairness when it comes to product features and being first. Business, though, is not fair, even if it is more just than we care to admit.
The Samsung Galaxy Gear is horrid. I was tempted to write an article about watches and Samsung last week, but Ben Thompson did it way better today.
- Samsung rushes to create a ridiculous watch in order to be the first.
- Someone else (probably Apple) will release a great watch.
- Samsung will copy it.
But hey, it's business, so it does not really matter if it's fair.
N.B.: The only thing working against Samsung in this case is that watches are fashion accessories.
The only hardware maker that has a sense of fashion is Apple (the iPhone is almost a “jewel” in its form after all). A watch needs to be elegant and refined. Samsung is probably incapable of doing that given its history of tasteless products.
N.B. 2: The funny thing about watches is that, ordinarily, they are quite expensive. $300-400 for a “smart” watch that looks like a quality watch won't be that much for the targeted market.
Update on N.B. 2:
My watch is gold too. Thieves: grab that instead. You can buy a good few iPhones with the proceeds. — Matt Gemmell (@mattgemmell), October 1, 2013