Metamuse Episode 55 — April 28, 2022

Mac app design

Pro apps on macOS have a look and feel unlike apps on any other platform. Julia and Lennart join Adam to get into the details of designing and implementing Muse for Mac. Topics include the pros and cons of building with Catalyst, how the Muse canvas mixes with system conventions and UI chrome, and our experimental approach to developing the keyboard+mouse command vocabulary. Plus: how Julia rediscovered her love of right-click context menus.

Episode notes

Transcript

00:00:00 - Speaker 1: The iPad is the perfect device for being able to immerse yourself and just explore, versus the Mac is all about getting things done and about speed and efficiency. If we embrace that, we naturally end up with apps that are quite different.

00:00:24 - Speaker 2: Hello and welcome to Metamuse. Muse is a tool for thought on iPad and Mac, but this podcast isn’t about Muse the product. It’s about the company and the small team behind it. I’m Adam Wiggins, here today with my colleagues Julia Roggatz.

00:00:38 - Speaker 3: Hi, Adam, nice to be back.

00:00:40 - Speaker 2: And Lennart Ziburski. Hi. And Julia, I know you often spend winters traveling in sunnier places, and you recently returned from Panama. How was the experience of working remotely during sort of travel holiday activities this time around?

00:00:56 - Speaker 3: Yeah, it was fantastic as always. And in this case, on my winter trip, I actually got to indulge even more in the traveling part since I switched, I think about half a year ago to working only 4 days a week. So I had 3 days off at a time. Sometimes I took an extra day here and there, so I had a lot of time to travel around.

Discover the country, and then, yeah, spend a few days a week working on Muse, actually often interweaving this with long-distance travel.

I’m often stuck in, you know, 6 to 8 hour cross country bus rides and those actually ended up being perfect opportunities to just have a deep focus day and kill a lot of time doing that.

00:01:39 - Speaker 2: We could usually tell when you reconnected to the internet because a whole bunch of commit messages and like pull request, things would kind of stream into the Slack channel simultaneously.

00:01:51 - Speaker 3: Yeah, that’s right. I do have the advantage that a lot of the stuff that I’ve been working on actually I can work on offline, so we were kind of in a phase where We didn’t have to do a lot of design decision discussions. It was more fixing bugs, implementing little features, so I would just make sure that I had at least, you know, 10 of those small things queued up for the trip and then would just work through those while offline, then connect back online and yeah, have a nice new update for everyone that was waiting for it.

00:02:20 - Speaker 1: Yeah, and I’m quite impressed by how you’re able to do that. Julia, both combining sort of vacation and work. Like whenever I’ve tried to do that, I both got nothing done and had a bad time, basically. And so I think it’s especially impressive with the launch we are working on.

00:02:36 - Speaker 2: Well, it certainly seems like a skill you’ve developed Julia, which is the ability to be really focused when you need it and then switch out of that and go fully into OK.

Now in this interesting place, I want to explore have adventures be fully present in my environment for a day. A few days, whatever it is, and when you have that work block, whether it’s on the bus or just days you set aside sitting in the hotel or whatever. So I think a lot of that is, at least it seems to me like a skill you’ve built up over quite a lot of years because this is just the lifestyle that you want to lead and so you’ve spent the time to create your mental discipline around that.

00:03:14 - Speaker 3: Yeah, it surely does require some discipline, I would say, but it’s something that I’ve cultivated over the years. Being able to shut out work when it’s not time to work and really enjoy life is something that’s really important to me, and that’s actually where I draw the energy to then go back to the computer and be productive.

00:03:35 - Speaker 2: So our topic today is the design and implementation of our Mac app, that’s Muse for Mac. For those following our story, the Muse 2.0 product has been in beta for a little while. We’re coming up on a launch, and one of the key features, probably the most notable feature for users and customers, is the Mac app.

And I thought it would be really great to get both of you on here to talk about this while it’s still fresh in your minds. While other folks on the team have maybe been deeper on things like the sync engine, for example, you two have really been the mind-melded dynamic duo making the Mac app come to life. I’m more of a user, an avid user of it, but I also get to follow along all your design discussions in the Slack channel. I find it just really fascinating, and I hope we can dive into a bunch of those things today.

Yeah. And maybe we could start with kind of an overall design approach. So obviously Muse 1.x was an iPad-only app, and we have a lot of unique concepts in there: this open canvas, the nested boards, the different types of media cards. And then we’re thinking, OK, we want to add this additional platform because we know the desktop is such an important place for doing work, but we don’t necessarily just want to do what Mark would call the transliteration, just porting it straight across without a lot of thought, because the desktop is not only a different set of hardware, but you’re actually, I think, in an almost different mindset when you’re sitting at your desk at your keyboard in this focused posture. You probably have an almost different approach to your work than you do when you’re leaning back, for example, on a sofa or reading chair with your iPad and pencil. So I’d be curious to start off with: what’s the overall design approach? How do we think about bringing Muse to this new platform?

00:05:27 - Speaker 1: Yeah, I think it’s all about the balance.

So on one hand, we have this iPad app already. We have Muse on the iPad, and we have a lot of users for it, and we have sort of established a certain design there. We have made a lot of design decisions, and now we are bringing that to a completely new platform that, you know, in some ways is connected to the iPad, but still has its own set of rules and conventions, and also its own set of users that have different expectations than iPad users. And so we are trying to balance building really this native Mac app from the ground up, like what would it look like if Muse on iPad didn’t exist and we were building Muse just for the Mac, with sort of the existing iPad app, and trying to make it into one coherent model.

00:06:15 - Speaker 2: And I think that’s a really unique piece of our approach. We’ve talked a bit before, perhaps on the native apps podcast episode, where we said that many, maybe most, apps tend to have a home in one place, one platform where they originated.

Instagram was a phone app, and then maybe there’s a web version, but it’s kind of a companion or an add-on, just an additional thing. Or similarly, with a lot of SaaS tools, Notion might be a good example.

It’s really native to the web; it feels most natural, and it’s the baseline, to be in a web browser on a desktop computer. So when they make an iOS app, it feels like a little bit of a bolt-on. It maybe doesn’t follow a lot of the platform conventions you expect, and it’s often lagging behind the features that are in kind of the main platform. And I think the idea of something where we want to, like you said, bring it to this new platform, design it as if it were a new app there, while also sharing a lot of primitives and concepts, and of course actually the data, because it syncs between them, so it has to share that exact data set. I think that’s an unusual thing.

00:07:23 - Speaker 3: And I think also kind of sharing some of the core values of Muse. So one of the things that we had in mind designing the iPad app is that we wanted it to feel really fast and fluid.

And one of the things we did to achieve that is to have this quite sophisticated gesture system where you can use all of your fingers. You can move cards around, you flick them off the screen to delete them. There is a bit of a learning curve there, but if you really do learn this design language, then you are able to work with the tool really quickly and efficiently.

And obviously we couldn’t just bring that 1 to 1 to the Mac because you can’t use all your 10 fingers on the Mac, at least not currently. But we still want users to be able to work quickly and efficiently with the app. So thinking about how to bring the app to a new platform, keeping this value of making it feel very fast and fluid, but using other tools such as really good cursor support, keyboard shortcuts. I’m sure we’re gonna talk about that in more detail, but yeah, those were some of the thoughts for sure.

00:08:25 - Speaker 1: And in a way, it’s taking something that’s traditionally seen as a weakness of native apps: you have to build a new app for every platform, you know, that’s terrible; let’s do a web app and you have one app, it runs on everything. But I think we’re sort of trying to take that and turn it into a strength by saying, OK, we have a chance to build an app for each platform, and it can be different, and we can leverage the system strengths of each. And I think if we get it right, that can be a really big advantage for native apps, and sort of what they need to compete with web apps.

00:08:58 - Speaker 2: Yeah, that is a lot of the appeal. It obviously is not a user benefit other than I suppose, being more universal or being available on more platforms, but it is a benefit to the company to have less code to maintain or just a smaller engineering team or less work to do to keep everything in sync in the sense of features, you know, if you have a native Android app and a native iOS app versus if they share a code base with I don’t know, React Native or something like that.

So notably here, you know, when we come to the technical side, we are indeed sharing a code base between the two with only a 5 person team and only 3 of those are actually engineers, you know, that’d be pretty tough to do 2 full complete apps with the team that size, but we are getting leverage from the shared code base, right?

00:09:42 - Speaker 3: Yeah, we’re definitely there, and I would say I’m even quite surprised by how much code we were able to use or reuse across both platforms.

You know, the very first time we flipped that switch, so maybe for some context, the app is built in Mac Catalyst, which is a framework that Apple released a few years ago.

That lets iOS apps be ported to the Mac by basically just clicking a checkbox, making it run, and then seeing what it’s like. And Muse was actually quite usable from the very first build, but then I guess it’s the classic 80/20 rule: most of it works really well, but to get to that really polished state that I hope people feel the Mac app is in,

you’ll actually spend a long, long time, in our case several months, polishing it and improving it. But most of the basic logic in the app is just the same between both apps, and notably also the sync layer that we were working on in parallel that lets users use their data on both devices.

That’s all just shared across all platforms, and that means a lot less code to maintain, fewer bugs to fix.

If we want to introduce a new feature, we can do it on both platforms simultaneously.

Yeah, less testing to do. So it’s really a great advantage. And there are, of course, a few downsides, like you end up sprinkling your code quite a bit with: if I’m on the Mac, have this component look like this; if I’m on the iPad, it should look a little bit different. But it’s really quite manageable, at least at the moment.
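As a hedged illustration of the kind of sprinkling being described (the function and values here are invented, not Muse’s actual code), a Catalyst code base typically branches on the platform with Swift’s compile-time `targetEnvironment` check:

```swift
// Hypothetical sketch of a per-platform tweak in a Catalyst code base.
// Only the targetEnvironment(macCatalyst) condition is real Swift; the
// function name and values are made up for illustration.
func cardCornerRadius() -> Double {
    #if targetEnvironment(macCatalyst)
    return 4.0   // denser, pointer-friendly look on the Mac
    #else
    return 8.0   // larger, touch-friendly rounding on iPad
    #endif
}
```

Because the check is resolved at compile time there is no runtime cost; the trade-off the speakers mention is simply that these branches accumulate throughout the shared code base.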

00:11:16 - Speaker 1: And maybe Muse is even in a bit of a unique position there, in that we have this what we call the Muse canvas, which on the iPad is basically most of the app, right? We don’t even have any visible UI chrome by default, but you have the canvas where you place all your content, and we don’t really want to mess with that at all.

Like we don’t want to move your content around or change how it looks on different platforms. So I think that was one of the first things we were able to more or less set in stone for bringing Muse to the Mac: OK, the canvas is going to look the same, we just have to make sure it works and adapts to the system conventions, and that’s already 90% of the app logic, right? And so then we have all this UI chrome around it, where on the Mac I think we actually have a bit more, since, you know, you have the menu bar and a bit more of an established system UI that every app needs to provide.

00:12:09 - Speaker 3: But at least, as you said, these interface elements, the menu bar, are actually perceived by the user as belonging more to the operating system around it, so not necessarily chrome of the app itself. And on the Mac we were even able to remove some of the chrome that we have on the iOS app, where on the iPad, when you tap on the canvas, the action bar pops up that lets you do all kinds of actions. We just moved all of that into the menu bar or into keyboard shortcuts. So in a way, the Mac app is even more focused on your content now and doesn’t have any buttons that get in the way.

00:12:46 - Speaker 1: And for me, that was actually a really nice positive difference from designing for the iPad. A lot of the time on the iPad, we would have to build our own interaction or our own interface element for the user to see. On the Mac, there’s already an established standard for that, maybe even already a view that Apple built that you can plug into. And yeah, there were just a lot of times where we could basically have something on the Mac without having to rebuild it ourselves like on the iPad, just because Apple already provides it.

00:13:18 - Speaker 3: Yeah, definitely rediscovered my love for right-click context menus. Yeah.

00:13:23 - Speaker 2: Same, yeah, I do a lot with the right-click stuff on the Mac. Yeah, it’s interesting because, you know, we have such a, I don’t know if you’d call it quite an internal culture, but basically a pattern of needing to always reinvent everything in some way as we’re building for the iPad, because there is so much less precedent for productivity software there.

And it was almost kind of relaxing to realize that, yeah, the Mac has such a rich and long history of great productivity software. There’s obviously specific documentation like the Apple HIG, but there’s also just a lot of great apps that you can look at and basically say: oh, folks have already figured this out and it works great, we don’t need to be that original, we can just do what others do, you know, adapt it to our situation and try to find the best possibility that fits with our open canvas and the cards and all that sort of thing. We can draw from that rather than needing to always invent everything from scratch.

To that point, I’d be curious to hear the apps that we took for inspiration or reference. I know I saw very often, you know, referencing, for example, the behavior of the Finder as a bit of a benchmark for: here’s the baseline of what all Mac users are going to expect in terms of how basic interactions work. Yeah, what are some examples that we drew from?

00:14:53 - Speaker 1: Yeah, there were a lot of them. And I think part of it is that Muse cannot really be put into one category where we can just look at the other apps in that category and see how they do it. So yeah, we looked at the Finder, because Muse has a lot of sort of file-browser-style parts to it as well. We also looked at Sketch and Figma, sort of the classic design apps, which have really set many new UI conventions over the last few years, I think.

00:15:20 - Speaker 2: Well, and importantly, they also pioneered the infinite canvas stuff, I guess even going back to Adobe Illustrator, for example. Certainly some of my early inspiration for wanting to make a sort of open canvas ideation tool came from watching how creative people often use Illustrator, where they have this just big open space and you can zoom out, and you have a lot of iterations of one idea up in this corner and some iterations over here. So even though we’re not a design tool or an illustration tool, some of the precedent that’s been set on the open canvas is something we do borrow quite a bit from.

00:15:55 - Speaker 3: One other I’ll mention, even though it’s maybe just a different flavor of the Finder, is actually the macOS desktop. I think that’s one of the areas of the operating system that comes closest to the experience of the Muse canvas.

It’s just kind of an open space where you have items, I think depending on how you have it configured, you can arrange them freely, drag them around.

And I was looking to that specifically when implementing the multi-card selection behavior. This is something that is also new in Muse 2.0, and in Muse on the Mac specifically. I was surprised to see how we actually have quite an intuitive understanding of how selections should behave, based on how our operating systems behave. So whenever we implemented something in the multi-card selection space that didn’t feel quite right, it was usually because it was different from how the macOS desktop does it. So that was definitely a good inspiration for figuring out what kind of behavior users expect and what just feels right.

00:16:59 - Speaker 1: Yeah, and it’s really fascinating how especially this really core part of the desktop experience of macOS, you know, it’s been around for decades and it really hasn’t changed, and at least for me, I basically grew up with it, right? I don’t know any other way it could work. And so it’s very ingrained in how I think these interactions should work. And that’s quite different from the iPad, where I kind of know a time before the iPad, and I also know how it started and how Apple got there very iteratively over just the last few years, and it’s still changing things, moving things around every year.

Even app developers haven’t really agreed on a single way to do it, and probably even Apple itself hasn’t really agreed on one way to do certain things on the iPad. And so in that way, I think it’s really helpful to just look at what Apple is doing on macOS as sort of the gold standard, even though, especially with the Catalyst apps Apple has been doing over the last few years, there are also a few outliers, apps that just aren’t as well designed or aren’t as well fitted to the macOS system. And since Muse is a Catalyst app, you also have to specifically look at Catalyst apps, where to me Craft stands out as one example of just a really well-designed Catalyst app. I think we looked to them quite a lot for seeing what’s possible on Catalyst and what’s a reasonable solution in between the Mac and the iPad.

00:18:30 - Speaker 2: Yeah, definitely. I think we have to give a big shout-out slash some appreciation to Craft.

They’re a very good comparable to us, because they did start on iPad and then came to Mac, and they have sync between those.

They were an early mover on Catalyst, and I think probably, you know, we benefited from starting, you know, a year or two later than they did, and I think they faced a lot of the bugs as that technology was still pretty early and they wrote a nice long guide that I think we referenced quite a bit.

And on top of that, their team was even nice enough to answer a few questions we had, which usually came in the form of: we go to look at how the preferences panel is implemented, or the toolbar or something, and we go, oh wait, Catalyst doesn’t seem to support that API that we normally would have elsewhere. And then basically we were able to ask their team, and they said, oh yeah, you know, here’s how we did it.

And that was just really, really nice to have. Referencing back to our careers episode, maybe, where the advice is always talk to your elders, people who have, you know, forged the path before. And so, yeah, very appreciative of their help along the way.

00:19:35 - Speaker 3: Yeah, definitely, I’ll second that. For sure, Craft has been a big inspiration for us and kind of the benchmark on the kind of quality that you can actually deliver building a Catalyst app.

And on many occasions they’ve helped me with some code-level support, or even just to get confirmation on an issue that I got stuck with, to figure out: is this really not possible? Is this potentially a bug in the system?

There’s just unfortunately quite little documentation out there on Catalyst, and not so many people are using it. So if you do run into some weird, unexpected behavior, it’s often kind of hard to find materials to solve that. Just talking to the Craft team about it and hearing that, yeah, we got stuck on the exact same thing, it seems to not be possible, was a good sanity check for me.

00:20:28 - Speaker 2: So we mentioned the Finder, and also the, I guess, Finder-ish thing that is your Mac desktop, as kind of our main inspirations or references for selections. And you mentioned that multiselect is new in Muse 2.0, which was sort of an interesting fallout. We weren’t, I think, necessarily thinking the 2.0 product was going to include more powerful selections, but it’s more that once you’re on the Mac, you just expect selections to work a particular way, and, yeah, you expect them to be more powerful.

That’s just the nature of the desktop platform and so that pretty naturally led to implementing much more there, and then a lot of that does indeed benefit the iPad.

So maybe the starting place there is: how do selections actually work in Muse?

00:21:16 - Speaker 1: Yeah, so far on the iPad, it really just works with the pencil, right? So we rely on the pencil on the iPad a lot, and we have the selection tool for the pencil, and you can use it basically to draw a lasso selection around your content, and then you can move that.

But we don’t really visually indicate that much what exactly is selected, and you don’t really have a way to select cards one by one. And so on the Mac, the selection system kind of requires that.

So if you select something in the Finder, or even on the Mac desktop, you know, you can very quickly select something just by dragging open a rectangle with your mouse, and everything within that gets highlighted and selected. And even from that stage, you can hold down command and deselect things again, or hold shift and select a lot more things, move those around. Yeah, it’s a very well-thought-out and established system that really works exactly this way across everything on the Mac, ideally. And yes, I think we really needed to follow that for Muse as well in order for it to feel native in any way to the Mac.

00:22:24 - Speaker 3: Yeah, and then on the Mac, even taking it a step further and allowing you to just single-click a single card to select it, so you don’t even necessarily need to draw a selection around it with your cursor. The default behavior when you click on a card is that it gets selected, and if you click again, it gets deselected. We actually did have to make a few compromises here, because for some card types you just expect something else; for example, if it’s a text card, you expect that clicking into it will let you edit the text, and that is in fact the case. So there were a few places where we just had to make the decision to sacrifice consistency for user expectation, but I think we landed in a pretty good place there.

00:23:15 - Speaker 1: And I didn’t even realize that consciously before, but on the Mac, basically anything you click on is first selected, right? Like if you click on an email, it gets selected. If you click on something on your desktop, it gets selected.

And so while on the iPad selection is really about bulk editing, like you can press the edit button on the iPad and then select multiple things if you want to edit multiple things, on the Mac it’s really just used for everything, even if you just want to interact with a single item. Like if you want to delete something, you first select it and then press the delete key, or you press enter to rename it, or something like that. It even gets selected if you move an item from one place to another; by doing so, you also select it. And so it’s really such a core interaction on the Mac, while being this special case on the iPad. And I think that was just a really big shift for us as well, even in how we think about it.
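To make the semantics concrete, here is a minimal, hypothetical model of the click behavior described in this passage (the function and names are invented for illustration, not Muse’s actual code): a plain click selects just the clicked card, clicking the sole selected card again deselects it, and a command-click toggles a card in or out of a larger selection.

```swift
// Hypothetical sketch of Mac-style click selection over canvas cards.
func updatedSelection(current: Set<String>,
                      clickedCard: String,
                      commandHeld: Bool) -> Set<String> {
    if commandHeld {
        // Command-click toggles the card without touching the rest
        // of the selection.
        var next = current
        if !next.insert(clickedCard).inserted {
            next.remove(clickedCard)
        }
        return next
    }
    // Plain click: select just this card; clicking the sole selected
    // card again deselects it.
    return current == [clickedCard] ? [] : [clickedCard]
}
```

Modeling selection as a pure function of (current set, clicked card, modifier flags) keeps the convention testable independently of any UI framework, which is one plausible way a shared iPad/Mac code base could host platform-specific rules.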

00:24:13 - Speaker 2: Seeing how the selections came together also really emphasized to me this difference in, yeah, use case and setting for each platform.

With the iPad, you’re doing relaxed thinking, you’re reading, maybe annotating, brainstorming; with the Mac, you’re doing productivity, heavy research, and particularly this kind of idea of bulk editing, which came up a lot, I feel. And you see that with the selections: you can, for example, just do the simple thing of hitting command-A to select everything so I can move it around a little bit, or, yeah, selecting a whole screen or half a screen full of stuff, and then you can fine-tune the selection with command-click, and then, you know, cut that and move it to another board, or drag it to another window, or something like that. You can do all of that stuff on iPad, but you have the smaller screen; you don’t have the same kind of speed and precision; you don’t have the keyboard shortcuts. So while it’s more intimate and natural and organic to touch things and use your pencil and stuff, it’s just not fast to do those big bulk edits. I guess intuitively I knew before, OK, the Mac would be better for that sort of thing, but I wouldn’t have really been able to say why, and now I feel like selections, powerful selections, really are the core of that.

00:25:29 - Speaker 1: Yeah, for sure, and it’s in combination with keyboard shortcuts as well, right? Like you select something and then you want to do something to it quickly, and with the Mac, you have sort of the guarantee that every user has a way to quickly select something and has a keyboard to quickly manipulate those objects. And I feel like that is the magic combination we have there that makes the Mac so powerful, and the iPad is kind of missing both of those, sadly.

And so just because of that, it kind of becomes a very different platform that can’t have speed and efficiency as the target, but has to have its strength in direct manipulation and sort of the immersiveness you get through touch input.

00:26:11 - Speaker 2: One area I think is important, or that I’ve observed in my own use of Muse and others’ as well, is this concept of culling, what I usually call culling. You probably dump a bunch of source material that you think is going to be inspiration or reference, or a starting place for your thinking process, into a board. But often in the process of sorting through it and trying to find the patterns and figure out what you want to do with it, there’s this element of: oh, you know, this thing doesn’t fit in, I don’t need it, or I don’t want it, or it’s a duplicate of something else. And that’s why I think that gesture we have on iPad, which is to throw something off the edge of the screen (three of the four edges of the screen, if you swipe something in that direction, it just disappears), is such an important gesture, so easy to do and top level, because of that quick culling.

You can just dump a bunch of junk in because it’s OK, you can throw out the stuff you don’t want. And it’s very good for the kind of one-offs: I’m going to toss this one, this one, this one, and that’s pretty quick. Whereas, yeah, on desktop we’re just used to: you select a couple of things and you hit the delete key, and that’s a very quick way to do it. And of course, throwing something off the edge of the window, I don’t think it would even really make sense there. That’s just not an interaction that is at all pleasing to try to do with a mouse, nor does it really fit the paradigm at all. Whereas, yes, there is something really well established: select and hit delete, and that’s very fast in its own way.

00:27:37 - Speaker 1: Yeah, there were some tough decisions there, I would say, of: OK, which elements of the iPad app can we just take over and apply on the Mac as well, and which things do we actually have to cut or find a different way to do on the Mac? Since we don’t want to alienate our users or take things away from them, basically. But we also have to be very careful not to just take what’s on the iPad out of convenience and say, OK, it’s gonna work the same way on the Mac, it’s gonna be fine. And yeah, so weighing these was always, I think, a difficult choice, even for tiny details.

00:28:10 - Speaker 2: Another thing that points to the importance of selections is kind of where they sit in our gesture space.

I don’t know if that’s the right terminology when you talk about Mac, but that brings me to another section that I’d love to hear you both talk a little bit about, which is, I know you put a whole bunch of time into questions like, How do you navigate into a board? Is it a single click? Is it a double click, or what happens when you click on open board space? Does that kind of turn into a little grabby hand and let you pan the board, which actually is what it does on iPad. If you just put your finger down and open board space and then move your finger, you start to pan.

But I think here we ended up with: when you put down your cursor and click in open board space and then drag, you start to get a rectangle selection.

So we’re saying the selection is so important, we want that single-click default to actually be that. So I’d love to hear about the process of coming up with the holistic idea for what the Mac gesture space is.
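One way to picture that rectangle-selection behavior is as a simple intersection test: the drag defines a marquee, and every card whose frame overlaps it joins the selection. A hedged sketch with invented types, not Muse’s real implementation:

```swift
// Hypothetical sketch of marquee (rubber-band) selection.
struct Frame {
    var x, y, width, height: Double

    // Standard axis-aligned rectangle intersection test.
    func intersects(_ other: Frame) -> Bool {
        x < other.x + other.width && other.x < x + width &&
        y < other.y + other.height && other.y < y + height
    }
}

// Returns the ids of all cards whose frames intersect the marquee.
func cardsHit(by marquee: Frame, cards: [String: Frame]) -> Set<String> {
    Set(cards.filter { marquee.intersects($0.value) }.keys)
}
```

The interesting design decision sits one level up: the same click-and-drag in empty space could mean "pan the canvas" (as touch does on iPad) or "open a marquee", and as described here, the Mac app resolves that ambiguity in favor of selection.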

00:29:08 - Speaker 1: Yeah, that was really something where, especially in the early parts, we really had to just try out a lot of different things. Basically for every possible basic interaction, whether it’s double click or single click to open something, to edit something, to make a selection, you know, we basically had a list of choices to pick from and combine. And yeah, in the end, it’s a lot of trying things and giving them to a few users and seeing how they react. And then you just have to consider all the different parts: you want to fit into sort of the Mac world, but you also want to fit into what you’ve established on the iPad. You want to make sure it just feels good to use and feels in line with how we want Muse to feel, basically. And then I think we actually also had to think a bit about: OK, what other things do we want to build on the Mac in the future? What do we want the Mac app to become at some point? Because we don’t want to box ourselves in by settling on certain interactions now that mean we can’t do something else in the future.

00:30:11 - Speaker 3: Yeah, and I think in many cases it’s actually trying out these different versions, even though we were unsure in the beginning what the right answer is. Once you try out one versus the other, you just immediately know because one feels wrong and the other one feels right.

So I think the example of zooming into a board with a single click felt wrong. It felt borrowed from the iPad where you tap to zoom into something is just a very well established pattern. Whereas on the Mac, opening a document is almost always a double click. So putting these two side by side, it was just immediately clear that one was much more well suited for the Mac.

And yeah, when it comes to scrolling the canvas, the good thing is that if you use a MacBook that has the trackpad touch input, that actually works with the two-finger scroll, just like you’re used to from almost any other scroll view, your browser or whatever.

00:31:11 - Speaker 2: As well as the pinch gestures, so you can pinch to zoom in on a board, pinch to zoom out. Yeah, you have the two-finger pan, so I was surprised how natural that stuff was. But then you also can’t assume they have a trackpad. They might have a mouse or a trackball or something like that, so we couldn’t rely on those existing, but it’s nice to have those gestures as a bonus, right?

00:31:32 - Speaker 3: Yeah, exactly.

00:31:35 - Speaker 2: So something was notable to me about the creative process here was that we started with a debug menu that had a long list of everything you could do that was kind of in the command gesture space, and then you would have options for each one. You could play a video with single click or double click or maybe some other thing. You could zoom to a board with this, this, or this.

And then you could go through and very granularly set your whole command space, but of course it’s very easy to get things that collide. If you said, I’m gonna single click to select, but also single click to zoom or something like that, maybe weird stuff would happen.

And then eventually, based on those experiments, I think, Lennart, you boiled it down to like 2 or 3 kind of groups that naturally fit together. You know, maybe one was more double-click oriented around actions, and one was more single-click oriented, and then you could ask folks on the team or users or whatever to try out these different setups and say what feels right to you. And I remember testing those out a little bit, and the real test of it to me is, once you get it and you’re really trying to do something, do you get lost in the flow? Does it just start to come naturally and you stop thinking about it, versus I have to stop and think, what do I need to click or press or hold or whatever to get what I want to happen to happen here.

00:32:51 - Speaker 1: Yeah, right, that’s really the key. It’s kind of about how everything feels when you use it together, right? You can’t really look at any one single thing on its own.

00:33:01 - Speaker 2: Now did you find that the existence of a cursor hovering over the canvas, was that sort of a major change in the sense of how the user experiences it, or was it more kind of minor?

00:33:15 - Speaker 3: To me personally, I regained my appreciation for correct cursor, how would you even call it? Correct cursor shapes. I think this is something that you don’t notice when it’s working as expected, but you definitely do notice when it’s not. Like, if you’re trying to resize a card and you actually have the normal arrow pointer, it just feels like something’s broken.

You expect there to be the little resize arrows pointing in different directions. I think same with kind of hovering over a card that you can click, like a link card. You just expect there to be the little pointing hand that indicates this is a link that you can click. And yeah, it was actually a bit of work to get all of that right, but I do appreciate the cursor and the subtle ways in which it cues the user as to what would happen if they interacted with this element.

00:34:07 - Speaker 1: Yeah, and I would say it’s actually quite underappreciated how much of another dimension the cursor adds. Like, it’s not just sort of a translation of the touch input, but to me it’s also about the interaction that happens before the touch input. Like, a cursor you can kind of move around before you commit to an interaction. And so the cursor shape can change, so you can see what’s about to happen. You can even show additional information if you want to on hover. But you can also use that to let the user just make more informed and precise decisions, I would say.
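For context, in a Catalyst app the contextual cursor shapes and hover effects described here are typically driven by UIKit’s UIPointerInteraction API. A minimal sketch under that assumption; the view class is hypothetical, not Muse’s actual code:

```swift
import UIKit

// Hypothetical card view; a real implementation may differ.
class LinkCardView: UIView, UIPointerInteractionDelegate {
    override func didMoveToWindow() {
        super.didMoveToWindow()
        // Attach a pointer interaction once the view is on screen.
        if window != nil && interactions.isEmpty {
            addInteraction(UIPointerInteraction(delegate: self))
        }
    }

    // Called as the cursor moves over the view; returning a style here is
    // what swaps the plain arrow for a contextual shape or effect.
    func pointerInteraction(_ interaction: UIPointerInteraction,
                            styleFor region: UIPointerRegion) -> UIPointerStyle? {
        // Highlight the card under the cursor to cue that it is clickable.
        UIPointerStyle(effect: .highlight(UITargetedPreview(view: self)))
    }
}
```

The same delegate can return different styles per region, which is how one view can show, say, resize arrows near its edges and a pointing hand over a link.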

00:34:42 - Speaker 2: Cueing what the user can do reminds me of one really interesting, call it subtree in the interaction space that you both went down, which is this, I guess I would call it this sort of ghost card that you get when you’re adding content. I’m not sure if that’s what you both call it, but I’d love to hear about the decision to go that way.

00:35:02 - Speaker 1: Yeah, I’m not sure what we call it. I, I don’t know that it needs a name, but yeah, it’s basically while on the iPad, when you add something, it immediately appears in your inbox on the left side of the screen and you just drag it to where you want it to be. On the Mac, it works quite differently where you add something and then it adds this little preview to your cursor and it follows you around and you can kind of figure out where you want to place it first and then when you click, you place it on the board.

00:35:28 - Speaker 3: And it even has a little additional hidden feature, uh, which is that you can actually click down to place it and then move your mouse with the button down to resize it in the same gesture.

So this is, I think, one of the little things where we felt like this is something that feels like Muse, even though it’s obviously a cursor-based and very Mac-specific gesture. But it’s a little thoughtful touch, and you know, if you wanted to make this image really big, you would otherwise have to click and then go to the bottom right corner and resize it, and we wanted to make it easier to do both in one gesture. So this is what you can do on the Mac.

00:36:10 - Speaker 1: Yeah, and this is something that really isn’t possible on the iPad, right? Like, the iPad in that way has a very simple input system, whereas on the Mac with the cursor, you know, you can very quickly, especially in combination with the keyboard, add something, find where you want to place it, confirm that, and then also set the size of the content. Whereas on the iPad, that would be multiple steps and take quite a bit longer. So in that way, I think it kind of maps to what we said earlier about the Mac, that it’s just a lot more about speed and precision, while the iPad is sort of about this direct connection to the content, which, yeah, is lacking when you use the cursor on the Mac to add something.

00:36:53 - Speaker 2: One thing I think they both have in common, whether something goes in the inbox or whether you drag out a new board from the left side on iPad with that kind of extra gesture, or whether you’re moving the ghost card for the new board or the image or whatever it is around to find where you want to place it.

In all cases, we really try to avoid putting content on the user’s boards in kind of a random place. Basically everything on your boards is things you have decided where it goes. I don’t know how much that’s a, you know, an intentional core principle that you always try to adhere to or something that just naturally came out, but it’s interesting because it feels to me like there is a shared common principle or set of values, maybe like you were referencing earlier, but how it is implemented per platform is extremely different because the input devices are so different.

00:37:42 - Speaker 1: Yeah, that is quite intentional, and that’s a big reason for why it works that way on the Mac as well. Yeah, I think since Muse is a spatial canvas and it is really directly about your content, it is really important that the user places everything. And, you know, Muse is not a linear text editor or something where we can just add new things at the bottom and the user will find them there. But since it is a spatial place, we feel it’s important that the user always has a say about where exactly something is.

00:38:13 - Speaker 2: And we talked about how a lot of what is on iPad is in our kind of in-app custom UI chrome. On Mac, you have the benefit of stuff that sits kind of outside the canvas, and that includes the toolbar, but then there’s that very ever-present top menu bar that every app has. You’ve just got the little apple in the corner and then the name of the app, and then you’ve got File, Edit, and so forth from there.

Now, those all seem very, I guess, standardized to me. They seem pretty similar across apps, but I don’t know if there are really good established conventions there or they just seem similar because apps borrow from each other. I don’t know, what was our process for coming up with the Muse menus?

00:38:57 - Speaker 1: Yeah, I would say it’s a bit of all of that. Like there are actually a surprising amount of sort of guidelines and just Apple documents about how you should structure and order your menus.

So I also think many apps don’t really follow it that closely or kind of make their own decisions. So in that way, you try to look at everything other apps are doing, you try to look at what Apple is doing, and then you need to figure out what is the right call for Muse. And I think for us, especially the challenging part was that we also have this iPad app which doesn’t have any of those menus, and we still need them to work similarly, and so we can’t rely on only those menus.

But it did actually mean that the iPad also benefits a lot from what we did on the Mac. Like, for example, we introduced a lot of new sort of context menus, right click menus for the Mac version where you can click on a card, click on the background of your canvas, and you get really a lot of useful options which we didn’t have before on the iPad.

And the interesting part to me was also that there isn’t really like a single menu on the Mac, right? Like, you have the menu bar, you have the context menus in different places, and then you have keyboard shortcuts that need to map to those menu entries. And so in that way, you’re kind of trying to build up this whole system of different menus that all need to make sense and sort of have the same structure and order for different content types, but also, you know, the different places the menu appears. And I think the iPad version of Muse also benefits from that a bit, especially in regards to keyboard shortcuts. Like, we always wanted to do better keyboard shortcuts on the iPad, but it’s never really been a focus for us on the iPad. But now on the Mac, it really is table stakes to have menu entries and then have keyboard shortcuts set for all of them. And so we’re trying to bring the same thing to the iPad as well and really try to build up this universe of actions that work the same way across iPad and Mac.
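In a Catalyst app, the menu-bar entry and its keyboard shortcut come from a single declaration via UIMenuBuilder, which is how the two stay in sync across Mac and iPad. A hedged sketch; the menu title, selector, and shortcut below are illustrative assumptions, not Muse’s real bindings:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // UIKit calls this when constructing the main menu bar on the Mac.
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder)
        guard builder.system == .main else { return }

        // One UIKeyCommand yields both the menu-bar entry and its shortcut;
        // on iPad the same command appears in the keyboard-shortcut overlay.
        let duplicate = UIKeyCommand(title: "Duplicate",
                                     action: #selector(duplicateSelection(_:)),
                                     input: "d",
                                     modifierFlags: .command)
        let cardMenu = UIMenu(title: "Card", children: [duplicate])
        builder.insertSibling(cardMenu, afterMenu: .edit)
    }

    @objc func duplicateSelection(_ sender: Any?) {
        // Dispatched through the responder chain to whatever view handles it.
    }
}
```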

00:40:54 - Speaker 2: Are the keyboard shortcuts between the two exactly the same, or are there places where they’re different?

00:41:00 - Speaker 1: Yeah, we’re mostly trying to keep them the same. I think the difference is more that there are some iPad-specific shortcuts and some Mac-specific shortcuts. Like, for one, of course, you know, the iPad has inking and maybe you want to do special shortcuts for that. But there are also a lot of system shortcuts that Mac OS just takes for itself, basically, and there are a lot more of those on the Mac than on the iPad. And so we have to be very careful on the Mac not to touch any of those, but we could use them on the iPad if we want to.

00:41:30 - Speaker 3: One other thing about menus that I realized, and I think wasn’t quite consciously aware of before, is that they’re actually also a great way to teach users what’s actually possible to do in an app.

I think on the iPad we spend a lot of time thinking about how to best teach all of these complex gestures to our users.

We had a few different approaches to onboarding, even popping videos into the inbox that would show two hands, you know, zooming into a card while also carrying another card with them. Since we don’t have any of that on the Mac, I think, actually, that’s for me how I often learn how to work with an app: to just click through all of the menus and see what the options are there and learn about the capabilities and how to navigate the app. So it’s basically almost like a little onboarding intro for free.

00:42:22 - Speaker 2: And maybe like a sort of a table of contents of what you should be able to expect to do, and not just the verbs, but also the nouns in many cases.

So I think of something like an audio editor. You know, I use audio editor tools as part of podcast audio editing, among other things. And so you might go into something like Audacity, and you go through the menus and you see there’s a whole section on adding and removing marks. OK, what’s a mark? Well, it turns out this is a way to put a little marker in the audio and possibly give it a label.

But if you’re new to that, now you know that this is a noun within the world of the application and maybe what you should learn about or what you should read about in the documentation. Or you just try adding one and see; you can kind of infer from, you know, the name and what happened, what it is. So yeah, there’s a lot of discoverability in those standard menus.

00:43:11 - Speaker 1: Yeah, and even Apple is actually really explicit about that being one of the jobs of the menu.

And so they really encourage you to build your menu in that way that users can, when they first start the app, go through the menu and basically get an overview of everything that’s possible.

And yeah, a system like that doesn’t really exist on the iPad, which I think is partly because, you know, the iPad is not as much a pro device. But yeah, for people that want to make it a pro device, that makes it a lot more difficult; they have to sort of do this kind of onboarding themselves and come up with another way of teaching users about everything that’s in the app.

00:43:50 - Speaker 2: I also give a quick shout out to the Mac help menu which has a default search box.

Muse has this as well, and so you can basically just search the menus for what you’re looking for. And definitely for a lot of apps that I use, for example, sophisticated programming editors or video editors, they have so many options and the menus are often many layers deep, and I can’t necessarily remember the keyboard shortcut for an action I use infrequently. But I can go into the little Help menu and type in something there. It’s almost like a little command line or, you know, Spotlight quick search kind of thing.

In addition to being a quick way to execute a command, it can help you discover. So I’m thinking now, if I go to Muse for Mac right now and, let’s say I’m looking for the duplicate option, or I want to remember what the keyboard shortcut is or where that’s located in the menu, and I type in DUP, I see Duplicate, but I also see Duplicate Ink, which maybe I didn’t realize before was an option, but I discovered that by using the quick search.

Now another huge area from my perspective is drag and drop, and that’s also important in the iOS and iPadOS world, but way more so in the world of, yeah, pro app workflows and big screens, where you’ve got multiple windows on screen at a time and you’re moving data between them.

And actually through this process, I think, of watching the two of you work on this application, I’ve rediscovered some of my love for native apps. So I found myself using a lot more, especially maybe smaller, just utility apps, something like Transmit for uploading stuff to S3, or Optimage, or Forecast, a little podcast compression tool. But in all of these, the drag and drop just works so beautifully. It’s just really nice that you can always drag from one tool to the next tool to the next tool, and it always works the way that you expect. Whereas, I don’t know, with the web and especially like Electron apps, you know, oh, someone shared an image with me in Slack, I want to grab that and put it someplace. Sorry, drag and drop just doesn’t work, or it produces some unexpected result. I drag the thing out and I get some weird URL, not the actual image.

So I’d love to hear about the design and implementation of drag and drop for Muse, including any challenges we might have run into along the way.

00:46:08 - Speaker 3: Yeah, so the way that drag and drop works on the iOS and Mac OS systems is that you basically define a set of file types that can represent the content that you’re trying to drag. And that could be one file type, for example, just a piece of text, or it could be several. So you could say this text could be represented by just the text, but it could also be represented by a file in your file system with a TXT ending. And depending on the content type, there might be a lot more versions in which you can deliver this content to other apps.

And then on the other side, a receiving app can get this packaged item, look into it, and try to figure out what possible format it might extract from it. So if you drag something into a text editor, it probably just cares about the text. For example, an app like Notes on Mac and iOS can actually absorb all kinds of different content. You can drag images, you can drag files in there, you can drag text, URLs, and it kind of tries to be as smart as it can about what’s possibly the format that the user would expect in this case. But it also created some challenges for us in trying to basically outsmart that system if we feel like what the system determines as the desired content type is not actually what we are trying to provide. So, coming back to the example of text, we thought it would be really cool if you could just select some text, drag it into your Finder, and have that write a TXT file into your Finder. But if I then select some text and drag it into the Notes app, instead of the text appearing there, it’s actually a reference to the TXT file appearing. So in this case, Notes sees text and a file, and it chooses the file over the text, unfortunately, which in the end led us to abandon the file type altogether and just serve up pure text. Any app that can read text will then extract the text from it, but unfortunately, the Finder doesn’t have the magic capability of turning it into a file by itself.
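Concretely, this multi-representation packaging goes through NSItemProvider: the dragging app registers each type it can serve, and the receiver picks the one it prefers. A rough sketch of the text-card case described here, under the assumption that Muse uses the standard UIKit drag APIs:

```swift
import UIKit
import UniformTypeIdentifiers

// Package a piece of text for drag and drop. Each registered type is one
// representation the receiving app may choose to extract.
func makeDragItem(for text: String) -> UIDragItem {
    let provider = NSItemProvider()
    provider.registerDataRepresentation(
        forTypeIdentifier: UTType.plainText.identifier,
        visibility: .all
    ) { completion in
        // Deliver the payload; returning nil means no Progress is needed
        // for an in-memory value like this.
        completion(text.data(using: .utf8), nil)
        return nil
    }
    // Additionally registering a UTType.fileURL representation here is what
    // produced the Notes behavior described above: Notes preferred the file
    // reference over the plain text.
    return UIDragItem(itemProvider: provider)
}
```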

00:48:20 - Speaker 2: Yeah, it does feel like an edge case. I feel like most of the common situations with text, images, PDFs.

Those all work, I think pretty much as expected every time I’ve ever used them anywhere. Now maybe it gets complicated if I’m dragging 3 text cards and 2 images and I’m taking it to another place like a craft or notion or something that can certainly support those and what order should they go in and do all of them come across that sort of thing.

But for the most part I’ve found that there’s certainly especially single items and even on the web. So if I’m, for example, want to compose a tweet. With maybe an image attachment, and I do that and muse and then I drag that over to my web browser right into Twitter and it just does precisely what I expect. The text becomes the text of the tweet, the image becomes the image attachment. So I don’t know how much that was a huge amount of work on your part to like handle a whole bunch of special cases versus that 80% of the common cases actually do work pretty well out of the box.

00:49:17 - Speaker 3: Yeah, it actually wasn’t an unexpected amount of work, I would say so. It’s nice to hear that we ended up in a state where in most cases it seems to do what you expect, but to even figure out what’s expected in each case and kind of weed out all the edge cases and make sure that, you know, when you drag two items with a different type, they both arrive in the format that you expected. It was quite a bit of work to consolidate all of that, but yeah, we got there eventually.

00:49:44 - Speaker 1: Yeah, and I’m seeing now how much magic actually happens in the background when you drag and drop something like as a user, it always seemed to me as, you know, straightforward logical operation of you have the content, you put it somewhere else. But yeah, in reality, there’s a lot that the system does in the background to figure out what part of the content exactly you’re trying to drag, how to display that, and how to transition that and sort of the fact that it usually does pretty much what you expect. It’s kind of surprising if you look at how it actually works.

00:50:16 - Speaker 2: Well, I continue to think drag and drop is one of the best, is that the word for it, interactions in computing.

Because it’s something that both power users rely heavily on to do their work, but also people who are non-power users I’ve seen this directly where they just find it really intuitive, almost surprisingly intuitive, like, oh, I can just press and hold on this thing and pull it over here, or I can just click and drag this thing over here, and it just takes it from one place to another. So I think that’s an absolutely fantastic interaction in the computing world and almost underused in a way. I think there’s even more that we can do with it. So, certainly, I think we’ve really made that a priority in our implementation work as we go along, both on iPad as the drag and drop capabilities have gotten more sophisticated there with multi-window and so forth, but obviously on Mac, there’s, again, much more precedent and much more capability to use that. I’m curious to know just kind of what happens behind the scenes when I drag, you know, for example, a PDF from Muse to my desktop, my Mac OS desktop or Finder or another app. Obviously it needs to export that file and that data, but if I drag it to another Muse window, I’m kind of within the Muse universe, although I suppose there it could just do an export and another import. But I think at least if I recall correctly, at least on iPad, if you have two different boards open, it is the same as if you had done a move operation from one board to the other. How does that kind of work behind the scenes?

00:51:52 - Speaker 3: Yeah, so in our case, it’s actually not quite the same because, at least on the iPad, we’ve decided to forego the default iOS-level system drag and drop interaction. When we first started designing Muse, we took a really close look at it, and since dragging cards around on the board was basically one of the most primitive core interactions of Muse, we wanted it to feel really good and really fast and fluid.

And the way that the iPad and iOS system drag and drop works is that you have to hold your finger down on an item for, you know, half a second or something like this, and then it becomes detached from its parent and you can drag it around, you can even then navigate to a different app potentially and drag it in there.

But it was precisely this tap and hold before you can actually move it that we were really bothered by, and you couldn’t actually really customize this length. You could, in a sort of hacky way, reduce it to a very, very short delay, but you still needed to set your finger down and hold it still for a fraction of a second before you could move something around. And we just felt like that wasn’t good enough for our vision.

So we implemented a completely unique and custom drag and drop system for moving stuff within the app.

And to then actually get stuff out of the app or into a different window, you have to engage Apple’s normal drag and drop. So the way that it works in Muse on iPad OS is that you actually hold down on the thing for, I think, you know, a little less than a second, and it plays a little lift animation, and you can see that this is something happening that’s different from the normal dragging around, and it becomes detached from the screen and can move between windows.

And in this instance, when you detach this object from the canvas, behind the scenes the code is basically asked to provide the representation of this item and how other apps can consume it. So we would package it up into like a little data object, or you could provide a URL pointing to a file on disk. But you can also provide your own local object that is unique to your app, and only your app knows what to do with it. So this way we’re able to drag a card from one window to the next without having to convert it into a PDF file, write that onto disk, and then on the other window, read it again from disk and turn it into a card. We can just reference the card object directly and kind of move it around a lot more efficiently, without losing any of the Muse-internal data.
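The in-app fast path described here corresponds to UIDragItem’s localObject: a reference that never leaves the process, so a drop into another window of the same app can skip serialization entirely. A hedged sketch; the Card type and PDF packaging are illustrative assumptions:

```swift
import UIKit
import UniformTypeIdentifiers

// Hypothetical stand-in for an app-internal model object.
final class Card { /* spatial position, content, ink, ... */ }

func dragItem(for card: Card, pdfData: Data) -> UIDragItem {
    let provider = NSItemProvider()
    // External representation: other apps receive a rendered PDF.
    provider.registerDataRepresentation(
        forTypeIdentifier: UTType.pdf.identifier,
        visibility: .all
    ) { completion in
        completion(pdfData, nil)
        return nil
    }
    let item = UIDragItem(itemProvider: provider)
    // Internal fast path: visible only within this app's own process.
    item.localObject = card
    return item
}

// On drop, a window of the same app checks for the local object first
// and falls back to the serialized representation otherwise.
func card(from dropped: UIDragItem) -> Card? {
    dropped.localObject as? Card
}
```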

00:54:34 - Speaker 2: If I understand that correctly, it’s sort of taking advantage of this multi-part content payload that comes with drag and drop. For example, you could provide a rich text version of some text, but also a plain text version, and whatever app is receiving it can decide which format it understands or is more native. And in our case, we have like a Muse-specific bundle object thing that only our app is going to recognize, plus a more generic export. Is that accurate?

00:55:01 - Speaker 3: Yeah, pretty much, yeah.

So in the example of a text card, for example, we bundle the text card as just a text thing that any old app will be able to unpack, but then also as a Muse card that contains a Muse text document that maybe has, you know, ink information attached to it or all kinds of other information that only Muse is able to interpret.

And the same actually applies to things like when you have a whole selection of different cards. This way we’re actually able to maintain all of the spatial information. So if you drop it into another window, all the cards still have the same spatial relation to each other, whereas if you drop it into, say, the Finder, they’ll just all be written into files, or if you drop it into Notes, they’ll be just linearly below each other in the document.

00:55:54 - Speaker 2: One smaller point in all this is we did a little experiment to use test flight to distribute the Mac beta.

This is a relatively recent addition; TestFlight, the way you distribute kind of beta or pre-release versions of iOS apps, has been around for a long time.

I would say it’s the gold standard, but I think it’s the only thing you can use. I don’t think there’s really any other way to do it.

Obviously, desktop computers have a history and tradition of, you know, you click on download and you get a .exe on Windows or a .dmg on Mac. And that, I think, continues to be pretty common for a lot of productivity software, but we decided to give TestFlight a try for the beta here. And yeah, what was our takeaway? Was that a good call, or do we wish we’d done the more direct binary download?

00:56:41 - Speaker 3: Yeah, I think we did the more direct binary downloads in the beginning, which usually resulted in me posting a file into our Slack and then everyone having to download it and open it from there. I think that worked well enough then. But as soon as we started using TestFlight, I think everyone was just like, oh, it’s amazing, finally I get automatic updates and I don’t have to keep downloading your stupid files every time there’s something to look at.

00:57:06 - Speaker 2: Yeah, well, historically Mac apps, they often do auto updating, but I think there’s a library, maybe it’s even like third party, trying to remember. I know our colleague Adam Wulf has done this before, but you basically need to build the updating behavior into the app, which can be done with a third-party library, but it’s a whole other piece of infrastructure. Whereas I guess with TestFlight, we put it in there and then the auto updates just kind of quietly happen behind the scenes.

And in fact, I think from a user perspective it’s even better than what you’re used to in those standard Mac apps, where you tend to get this thing where you run the app, and if it’s been a little while and there’s a new update, it says, oh, I’m downloading the new update, you know, quit and restart, and then there’s this brief disruption of your workflow. Whereas with TestFlight, I guess it’s built into the operating system, so it just kind of happens in the background. I go to run Muse and I’m just on the new version. Yeah.

00:57:54 - Speaker 3: And in addition to that, I think TestFlight is also just, you know, it’s basically built for distributing beta software and also to collect feedback.

So once we launched the public beta, we were able to, when we have a new update, add a couple of release notes that say, here, this is what changed since the last version, this is what we’d like you to test.

It’s also a tool basically to communicate with your users in a way. So that worked really well.

I’m generally very happy with TestFlight for the Mac.

I think it’s only available in the very latest operating system, and I know that especially with Macs, people tend to not update them so often. I think with iOS, the adoption rate of a new major release is a lot higher than with the Mac, but we decided to take that risk, and it turns out that most people that we asked to test were either already on the new OS or were happy to upgrade for this purpose.

00:58:51 - Speaker 2: Yeah, I think I ended up upgrading for this reason, which is, I think, usually what happens to me with the Mac: I don’t tend to update just because a new thing comes out.

Usually there’s some specific reason, some feature I want, some app that won’t run, and I’ll go, OK, well, I’m 2 versions out of date. I guess I’ll go ahead and get up to date.

I know there’s a lot of that with Xcode as well: if you need the newest Xcode, you need the newest Mac OS, and maybe you need the newest Xcode to build apps for the latest iPad OS, for example. So Apple stuff does tend to have that kind of, everything is connected and you need to be on the latest version to take advantage of stuff, and TestFlight here is one example of that.

00:59:31 - Speaker 1: Yeah, and I think it’s also a really important step from Apple’s part to kind of try to make developing for the Mac as easy as developing for the iPad, since, yeah, especially the last few years, Apple has focused mostly on iOS and the iPhone and developing for that, making that experience really easy.

And so now, if they want people to take those apps and put them on the Mac as well, you know, they have to make sure that they don’t confront developers with 10 other new things they also need to do, like worrying about beta distribution or how to share updates with their users. And so I think stuff like having TestFlight now just really enables us to very quickly get the iPad version onto the Mac without, you know, a lot of overhead.

01:00:17 - Speaker 2: Yeah, that’s always the best case scenario for any platform that someone might want to develop for, which is: focus on your app and the specific things that you’re doing, not the infrastructure that goes around it. And certainly how you get builds out to people, how they install it, how they update it, how they decide they’re done and remove it, all of that is infrastructure that I think is basically pretty standard and the same across all apps, and not really something differentiated. So actually it does make a lot of sense to have that be part of the operating system. So we’re happy with TestFlight. I think we mentioned earlier that we’re happy with Catalyst, but probably there are some pros and cons. What’s your overall reflection, Julia, or either of you, after having worked so closely with this technology for the last 6 months or whatever it’s been?

01:01:06 - Speaker 3: Yeah, I mean, I’m sure, like all young technologies, Catalyst still has a long way to go, but for the most part I was really quite impressed with how low the entry barrier was for me as an iOS developer to just get into Mac development.

First of all, like I said earlier, the app basically worked pretty well out of the box just by clicking one checkbox and building it for the Mac. So that was quite encouraging for us.

Of course, it still took a lot of time to tweak everything and really make it feel at home on the platform. But for the most part, Catalyst really was the right tool for us to use. There are a few things we had to work around, or we just had to acknowledge that something is currently not possible, or would not be possible without a huge amount of effort. So there are some compromises we had to accept. But I think if we had actually tried to build Muse on the Mac from the ground up with AppKit, first of all, that would have been quite a new framework for me to learn, and it probably would have taken us a year or two to get to an app that’s even close to what we have now. So I think in general Catalyst is a great tool, and all it takes is a lot of patience and attention to detail to really make an app that feels like it belongs on the Mac, and of course being open to compromising on some small things that are just not possible with the technology at the moment.
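The "tweak everything" work described here typically happens behind a compile-time platform check. A minimal sketch of the pattern, assuming nothing about Muse's actual source (the class and behavior below are purely illustrative): Catalyst apps share one UIKit codebase and gate Mac-specific adjustments behind `targetEnvironment(macCatalyst)`.

```swift
import UIKit

// Hypothetical example of per-platform tweaks in a Catalyst app.
// The class name and the specific adjustments are illustrative only.
class CanvasViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        #if targetEnvironment(macCatalyst)
        // On the Mac, hide touch-oriented chrome and rely on the
        // menu bar and keyboard shortcuts instead.
        navigationController?.setNavigationBarHidden(true, animated: false)
        #else
        // On iPad, keep the touch-first navigation bar visible.
        navigationController?.setNavigationBarHidden(false, animated: false)
        #endif
    }
}
```

The same conditional also gates API that only exists on one side, such as Mac-only toolbar or menu configuration, which is where much of the "attention to detail" effort tends to accumulate.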

01:02:35 - Speaker 1: Yeah, I’m also really happy with it. I think going in we had a lot of doubts and open questions, just because there aren’t that many Catalyst apps, and most of the apps that we really like on the Mac are not Catalyst apps.

And so in that way, from a design perspective, it was also a bit of a challenge, because I’ve spent the last two years working on this iPad app and now I’m trying to get into the macOS world and figure out what the conventions there are. And one of the primary resources there is Apple’s guidelines.

But those are not really written with Catalyst in mind. You can look at all of those, but you should expect that about half of the stuff described there just isn’t implemented at all in Catalyst, or is only implemented halfway in a bad version, and then it’s sort of an uncanny valley, really: like if there’s a certain menu type that Apple has implemented in Catalyst, but it’s not quite the same as the fully native one. And then there are a few tiny details that can be quite frustrating if you really care about that stuff. But yeah, so far I think we have mostly found ways to work around that and get it up to a standard where we are really happy with it.

01:03:49 - Speaker 3: I hope that we’ll still be able to get the title editable in the Mac toolbar. I think that was one of the biggest caveats we had to live with for now: you can have a title displayed there, but you can’t actually have any UI that allows you to edit it in place.

01:04:06 - Speaker 1: Yeah, the toolbar is a good example of something that is theoretically implemented in Catalyst, but it just doesn’t give you many options at all. And if you really want a functional toolbar, you kind of need to build your own thing, which you then try to make look as close as possible to the macOS one. And yeah, that’s always a frustrating experience if you have to do that.
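For context on what Catalyst does expose here: UIKit surfaces the Mac window's titlebar through `UIWindowScene.titlebar`, which accepts an AppKit-style `NSToolbar`. A hedged sketch of the basic wiring, with illustrative identifiers and a plain labeled item (note the title is display-only, matching the editable-title limitation discussed above):

```swift
import UIKit

#if targetEnvironment(macCatalyst)
// Hypothetical sketch of attaching a macOS-style toolbar in a Catalyst app.
// The "main"/"share" identifiers are made up for illustration.
class SceneDelegate: UIResponder, UIWindowSceneDelegate, NSToolbarDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene,
              let titlebar = windowScene.titlebar else { return }

        let toolbar = NSToolbar(identifier: "main")
        toolbar.delegate = self
        titlebar.toolbar = toolbar
        titlebar.titleVisibility = .visible  // title is shown, but read-only
    }

    // Toolbar items are described by identifier, not built directly.
    func toolbarDefaultItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        [NSToolbarItem.Identifier("share")]
    }

    func toolbar(_ toolbar: NSToolbar,
                 itemForItemIdentifier itemIdentifier: NSToolbarItem.Identifier,
                 willBeInsertedIntoToolbar flag: Bool) -> NSToolbarItem? {
        let item = NSToolbarItem(itemIdentifier: itemIdentifier)
        item.label = "Share"
        return item
    }
}
#endif
```

The identifier-based delegate API is the extent of the customization surface in Catalyst, which is why richer behavior, like an in-place editable title, forces apps to rebuild the toolbar themselves.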

01:04:29 - Speaker 2: I will echo being generally pleased with the results of Catalyst more from the internal test user perspective.

We talked about this in our native apps podcast episode, but it is very often the case with cross-platform frameworks, and the pitch for Catalyst, I guess, has some of the same qualities as something like React Native, where we say, OK, write once, run everywhere, or write once, run in many places, and the downside is that it often feels native in only one place, or it doesn’t feel native anywhere; it feels sort of off everywhere. And so I was kind of worried about that and on the lookout for it, but except for every once in a while when it pops up, where there’ll be, say, a default dialog styling that just looks like iOS and you wonder where that came from, for the most part, from the user perspective in terms of how it feels, I don’t feel like I’m running some kind of translation layer. It feels completely like a native Mac app to me, which I think is a testament to Catalyst, but also to the hard work you both put into making sure that it did truly feel native.

01:05:34 - Speaker 3: Yeah, thanks. It’s great to hear.

01:05:37 - Speaker 2: Well, as we look overall at Muse for Mac and how it sits side by side with Muse for iPad, it is, we hope, one unified product: not something being ported to multiple platforms, but with sync tying it all together so it feels fairly unified. Sometimes we talk about the medium for thought idea, and these different devices are just different inputs into that system.

And in practice, I think through the beta we’ve certainly heard that that is the case. We hear about lots of interesting use cases: certainly not just, OK, I work on my Mac for a bit and then I work on my iPad, but people may actually use them simultaneously. You might have Muse for Mac up so you can use it on a Zoom screen share while you’re simultaneously scribbling onto your iPad, and that syncs across seamlessly. But I think that’s the big bet we’re making here, and it’s a really big investment for a team as small as ours: to make two apps for these two platforms that share a core design language, values, and concepts, adapted to the very different purposes of each platform. The iPad, of course, relaxed, free-flowing, and intimate, and the Mac for productivity, focus, bulk editing, and so on. So I guess, how do we feel about that bet paying off? Is this an idea we think others might adopt if we prove it out, setting a new precedent, or are we, as usual, kind of doing our own weird thing that others won’t necessarily want to follow?

01:07:18 - Speaker 1: Yeah, I’m still really excited about that approach, trying to really position the different platforms as something that you use together and that each have different functions instead of necessarily trying to replace either one or the other.

And I think that’s something that a lot of apps kind of inherently end up doing, but I think it’s really interesting that we are trying to be explicit about it and really focus on it in the design of the app for each platform. So one way to look at it is that often the assumption is that you build an app for different platforms, for the Mac, the iPad, and the iPhone, just because the user will have different devices with them at different times: if they’re at the desk, they’re going to use the app on the Mac; if they’re on the go, they’re going to use it on the iPhone. But what I find for myself is that often, when I’m at my desk using Muse, I’m thinking, OK, I actually want to use Muse on my iPad now; what I’m doing right now is actually better suited to the iPad. And so I stand up, take my iPad, go to the couch, and that’s explicitly a different experience that I couldn’t have on the Mac. So it’s not just about which device is available and which device the user has; it’s really about the flexibility you get to change modes depending on what you’re working on. And I think that lends itself especially well to Muse, of course, since we have this approach of being a thinking tool that focuses less on productivity and more on the creativity side. And I think the iPad is sort of the perfect device for being able to immerse yourself and just explore, what I would call a free-roam experience, versus thinking on the Mac is all about getting things done and about speed and efficiency. And I think if we embrace that, we will naturally end up with apps that are quite a bit different.

01:09:29 - Speaker 2: I might also argue that there’s a general sense that you want to own or carry fewer devices with you; maybe that’s part of the appeal of the phone, especially for newer generations: you just try to do everything on your phone and that’s it. It’s super mobile, it’s always in your pocket, it’s secure, and so on.

But I think one of the, let’s call it user research insights, we were starting from here is that creative people, creative professionals, do tend to own not only multiple devices but maybe a lot of them. Maybe they’ve got the big monitor and the keyboard and the docking station, but they’ll also detach their laptop and take that on the go, and they may have the iPad, which may be more aspirational for creative work. Sometimes it ends up out of batteries in the drawer as more of a consumption device, but they may have it for that purpose, as well as the phone, obviously, but you might also have a Kindle for reading and other kinds of specialized devices.

So that’s one side of it. And I think there’s maybe a desire in some way to see all the different devices and platforms merge together into one: touch merges with desktop, desktop becomes portable, and so on. But it felt to me, in researching this and looking at how professionals really live their lives and what the devices on their desks look like, that we do own a lot of devices. We do want them for different purposes in different scenarios. But there, I think, we tend to get this really strong app siloing, right, where it’s like, OK, I do use the iPad for putting together inspiration with mood boarding, but I have a specific app I use for that, and it’s only on the iPad. When I’m done, maybe I take a screenshot and send it someplace else; it’s not a very fluid thing. It’s kind of just on that one device, and similarly, I have my desktop productivity tools, those live there and nowhere else, and that’s it. And so I think part of our idea here is, exactly as you were saying there, Lennart, that there’s this switching of modes that can happen seamlessly and with very low friction. You’re not thinking, oh, this would really be better on the iPad, but there’s a whole big process to get over there, and can I even do what I want? Because we have this syncing behind the scenes, because we do have the shared data model and conceptual model, it is actually a very light touch, a very small step, to say I actually want to be in a different posture, a different attitude, a different kind of interaction paradigm with my work, maybe only for a minute, maybe for 5 or 10 minutes, and then maybe I switch back. So again, part of the big experiment we’re doing here is: is there enough value in that that folks will really find Muse to be an unusual tool in their workflow and something that unlocks new creativity?

01:12:16 - Speaker 1: And a big part of that really lies with Apple, right? We as developers can only build this single app, but in the end it’s Apple’s messaging and product strategy that decides whether this sort of approach is successful. And I feel like Apple hasn’t been super clear about the messaging: they’re always saying, OK, the iPad and the Mac are not going to merge, but it kind of feels like they do over time, just very slowly.

01:12:42 - Speaker 2: Or you could argue there’s something like a superseding, right? There’s the “What’s a computer?” campaign, which I thought was quite clever, which basically points to younger people who grow up with touch devices and are able to do so much on an iPad, including their homework and creative tasks, that they don’t need one of these clunky old things with a keyboard and mouse; that’s for old people, or whatever. That’s kind of the idea that comes through in it.

But then at the same time, the iPad is not now, and I think certainly I know it would be Mark’s position to say not ever, going to be in a position to supplant the power of a true desktop, with files and the ability to program it and a keyboard and so forth. So, yeah, what’s the future there?

01:13:26 - Speaker 1: Yeah, I think the trade-off is often painted as either the iPad is able to replace the Mac for most users, or the iPad has to stay this media consumption device.

And I feel like there’s another pathway: the iPad can be a pro device. It can be very productive and used for work, but it can coexist with the Mac. And I think for Apple, that means thinking about the interface differently as well. What I’m seeing a lot with some of the new iPadOS features every year is them basically taking macOS features and trying to adapt them to the iPad. And they often do a really good job with that; like with the new cursor, they kind of reinvented the cursor on the iPad and really improved a lot on the desktop version. But in the end, if they’re always just taking parts of the Mac, adapting them 80% of the way, and making them a bit better, that’s not going to lead to a really different device or a really different workflow. What I would be really interested in is seeing cool features designed from the ground up for the iPad that maybe don’t even have an equivalent on the Mac.

01:14:32 - Speaker 3: Yeah, I think for me personally, I obviously have been working on this app for a long time and the iPad certainly has a place in my life, but having grown up as a laptop user, having learned to type really fast and use keyboard shortcuts from muscle memory, the iPad has never quite made it over the line of being a really productive tool for me.

So yeah, being able to use Muse on the Mac and enjoy all of the speed that comes with it, the cursor support, the keyboard shortcuts, the much more seamless interaction with other apps, getting content in and out really fast, is something that I’m quite excited about. And I hope that maybe in the future other apps that I like or love on the iPad will be brought to the Mac and unlock new workflows there.

01:15:26 - Speaker 1: Yeah, and I think that’s actually also a really important perspective: even though we are saying, OK, we have the Mac and the iPad app and they fulfill different roles, those roles will look different for different people. Some people might still use the iPad 90% of the time and only use the Mac version for very specific tasks, and others will start with the Mac and only have the iPad as sort of the on-the-go version. And I think that’s fine, as long as we are aware of those different use cases and try to make each platform reach as far as it needs to.

01:15:59 - Speaker 2: Certainly, I think part of this is an experiment to discover what indeed people will use Muse for Mac for, what they will use Muse for iPad for, to what degree they will use them together simultaneously, and to what degree they will maybe use one almost exclusively and the other very little. I think that will vary a lot by user, but I think it’s also a discovery process. Basically, I think our whole industry has been trying for decades to figure out how the tablet fits into our productive and creative lives.

And that’s sort of ongoing, and now I think we can say that Muse is doing our small part to help discover what the future of computing might look like in that area.

01:16:40 - Speaker 3: Strong closing statement.

01:16:42 - Speaker 1: Yeah, that’s a good ending.

01:16:44 - Speaker 2: Let’s wrap it there. Thanks everyone for listening. If you have feedback, write us on Twitter at @museapphq or via email, hello@museapp.com. You can help us out by leaving a review on Apple Podcasts. And Lennart and Julia, on behalf of myself as a user, but I think probably lots of other users and customers out there, thanks for the incredibly hard work and attention to detail put into bringing the Muse experience to the Mac. We’re all pretty excited to have it be part of our ideation workflow.

01:17:16 - Speaker 3: Yeah, looking forward to seeing what people say about it.

01:17:20 - Speaker 1: Yeah, I can’t wait to see it actually in the hands of users and seeing how they use it.

Metamuse is a podcast about tools for thought, product design & how to have good ideas.

Hosted by Mark McGranaghan and Adam Wiggins
https://museapp.com/podcast.rss