|Five Current UX Trends||https://cleverlance.com/en/blog/Pages/UX-trends.aspx||Five Current UX Trends||<p>In 2009, when we issued the first yearbook on User-friendly Interface in the Czech Republic, there were only about twenty professionals focusing on the area. Today, according to data from Glassdoor, a global leader in the area of information about jobs and employment trends, the position of UX designer is the 25th most sought-after job worldwide. Google’s entry into the world of education resulted in more than half a million certificates being issued. The field is still evolving and so are the requirements and outputs of these creative activities. Let’s have a look at the ones we currently find most interesting.</p><h3>1. User interfaces as brand bearers</h3><p>The more we can learn about users from research, the more time and resources development teams have to focus on the form and consistency of user interface outputs. In order to achieve the best possible user experience, all outputs need to have a unified design – companies cannot afford to use a different tone every time they speak to the client. Still, many companies only focus on the appearance and controls of the application, and at best perhaps also on the corporate colors. However, the brand essence carried by the user interface is an important communication dimension not only towards clients but also within the companies themselves. One such example is Nike, whose user interface is always easily recognizable at first glance.</p><h3>2. Design system as part of UX delivery</h3><p>In order for a brand to maintain its uniqueness, it is not possible for the look of individual technological solutions to depend only on the supplier or the platform. Design System Management (DSM), which is a set of standards for large-scale design management, is becoming increasingly popular. 
The tools created in this field provide non-stop access to the current design manual, even to third parties. This ensures visual consistency across various interfaces and digital channels.</p><h3>3. Increasing reliability through user experience</h3><p>Just like before, users who make a purchase get not only the product but also the buying experience, only this time in a digital environment. The fact that consistency of appearance and controls is crucial for the user’s relationship to the brand has been confirmed by many studies. For example, according to the Adobe Trust Report (The digital economy is personal, 2022), 57% of users claim that as soon as a company breaks their trust, they will not give it another chance. 70% of users then stated that inaccurate personalization reduces their trust in the brand. This goes hand in hand with the handling of customers’ data. Here it also holds that the more transparent the digital approach to the processing of the customer’s data, the more willing the customer is to share their data with a company they trust.</p><h3>4. Typography – small details with fatal impact</h3><p>Typography is a small but crucial detail for branding. The more text is transferred to digital devices, the higher the required quality of that text. A UX designer has to keep in mind that today text is used in various situations and places: a driver watching the dashboard in a car, a jogger setting the pace in their application, someone reading the news on the phone during their morning commute, a warehouse operator checking the data from a scanner, an operator configuring a machine in a production plant… All these situations have one thing in common: the user often holds the device in their hand and reads from a shaky screen. This of course places higher requirements on the font than display on a static device does. 
At Cleverlance we often create new fonts on the basis of the client’s requirements to ensure legibility and readability of the text in highly demanding conditions.</p><h3>5. Limits of internal UX teams</h3><p>Many companies that want to forgo UX consultants and create their own internal UX teams quickly discover the limits of such an approach. The expectation that a single internal UX expert can cover all UX skills often proves to be unrealistic. Sub-fields of UX such as research, copywriting or design are so specialized that no single expert can master all of them at the required level. This is why many companies now have UX teams with several members, and it’s becoming more difficult to find specialists with the expected skill levels and experience on the job market. As a result, the internal team often focuses on only one area of UX, for example research. However, without a high-quality designer, this only leads to theoretical results. A significant handicap of these teams is their narrow (albeit understandable) specialization in a specific product of the company. Designers and researchers then lack comparisons with, insights from, and best practices from other areas. That is where technological companies come in again, as the variety of the projects they handle can enrich companies with new approaches and nicely complement internal UX teams.<br></p>|
|The Automatic Testing Machine||https://cleverlance.com/en/blog/Pages/Automatic-Testing.aspx||The Automatic Testing Machine||<p>No universal testing program exists for automated testing. Every project needs its own unique script which is created based on an extensive expert assessment. Before each new project, an exact calculation must be made of which type of testing is optimal, effective, and most economical. Only after this can automated testing and a testing robot enter the picture.<br></p><p>
“Generally speaking, testing automation speeds up the process, that’s evident. And thanks to automation, several different scenarios can be tested, the scope of the tests can be expanded, and each of them can be performed identically because robots perform scenarios absolutely the same way each and every time,” says Tomáš Mertin, an automated testing system developer at Cleverlance. As a result, testing and code writing are essentially simultaneous, and developers can fix any errors or inaccuracies within a short period of time.</p><p>
“In recent months we’ve witnessed an increase in interest in automated testing, it’s a trend, but everyone’s expecting that it will reduce the number of people involved in development. I don’t think that’s going to happen,” says Mertin. “Automated testing will definitely speed up development. It also provides us with better knowledge of the state of the application at any given moment. But tests, deployment, operations – someone still has to maintain all that. The human dimension is going to stay,” Mertin explains, adding that he thinks automated testing will not fully replace humans. “But it will save time, which they can then spend on actual development.”</p><p>
Automated testing frameworks have proven successful in segments where development is constantly underway. Like the banking sector. “We’ve got a big project in which we’re practically building the entire digital banking framework. One phase has to precisely dovetail with the next one. These days agile management is used for things of this size, which makes it all possible,” says Jan Vajsejtl, who is in charge of testing at Komerční banka, one of the largest banks in the Czech Republic.</p><p>
In the past, large companies like banks used waterfall testing. Testers would receive completed sections while work on development would halt because the developers waited to hear what they needed to fix. If any major intervention was needed, it was followed up with another phase of testing, prolonging the work.
In the past two years, automated testing has been added to these conventional, time-tested, and efficient testing methodologies. It has proven successful wherever development is practically non-stop.
These are cases where automated testing makes a substantial difference. “Our experience is exceptionally good. The automated system my colleagues and I fine-tuned for our own needs allows us to test practically all devices and environments: cell phones, websites, and more,” Vajsejtl says.
Although the inside of the system is complicated, its use in practice is surprisingly easy. “I think the main advantage is that it’s essentially very simply written. So just a short, half-day training session is enough to be able to start using it. You definitely don’t need to know how to program or have some deep technical knowledge.” </p><p>
“For me it’s a testing success. We’ve put the testing framework to practical use and tried it out; my colleagues at Cleverlance and I tailored it to Komerční banka’s needs and augmented it with additional functionality. Given the amount of development we have, it’s a really efficient thing,” says Vajsejtl.<br></p>|
|When you say analyst, it means that...||https://cleverlance.com/en/blog/Pages/analysts.aspx||When you say analyst, it means that...||<br><div><p>From an IT perspective, being an analyst means several different positions with various job descriptions. These include the role of IT analyst, which is also commonly referred to as systems analyst, as well as business analyst, data analyst and test analyst. But what do people who hold these positions actually do and what are they responsible for? Let’s have a look at the roles chronologically as they enter the IT project.</p><h3>Business analyst</h3><p>The first analytical position to join the project is the business analyst. Very simply put, business analysts are responsible for communicating with the client, with business representatives on the customer’s premises. The aim of their work is to collect the client’s needs, transform them into requirements and rank them in order of importance. Subsequently, analysts develop a solution design, i.e. de facto build the software from the user’s point of view. They record their design in the business analysis, which means creating process diagrams, Use Case models or User Stories, activity diagrams, describing user roles, drawing wireframes of screens and so on, i.e. everything that will show how the system should work from the user’s perspective. You can read what a good business analyst needs to know here.</p><h3>IT analyst</h3><p>IT analysts, also known as systems analysts, enter the project process early, or together with the business analyst. Their responsibility is to design the technical solution of the system. In their work, they communicate intensively both with the IT architect, who is responsible for designing the concept of application development, and with the business analyst, who presents them with the functional requirements and a description of the solution from a business perspective. 
The IT analyst then designs and describes the details of the technical solution, individual system modules, data and object structures including their links, defines interfaces, models sequence diagrams, etc. The results of work performed by the IT analyst, together with the results of work performed by the business analyst, then constitute the specifications according to which the developers program the required system. This is also why a standard requirement for IT analysts is familiarity with programming languages such as Java, .NET, SQL or XML. Knowledge of methodologies such as RUP and ITIL, or of the now widespread DevOps approach to software development, is also commonly expected.</p><h3>Test analyst</h3><p>Test analysts prepare the test analysis. They study the inputs provided by the business and IT analysts and go through the processes and logic of the entire expected solution with them to understand how the system should work in the end. This means that they enter the project either after the business and IT analyses have been elaborated or before their completion. After familiarising themselves with the analytical documents, they develop test scenarios (Test Cases), test suites (logical groupings of tests which are related in some way) and test scripts. It may also happen that during the creation of test scenarios they come across a deficiency in the business or IT analysis. In this case, they will draw attention to this fact so that the business or IT analyst can incorporate the identified deficiency in the analysis. While creating test scenarios, test analysts also define the test data necessary for testing the software. In the end, they are able to propose a test plan, i.e. the order in which the individual test scenarios will be tested. Sometimes they are also the ones who prepare the test data or participate in the software testing itself.</p><h3>Data analyst</h3><p>Data analysts, as the name implies, work with data. 
Each system contains thousands, sometimes millions, of data records from which a wealth of interesting information can be extracted for business purposes. This concerns numeric values, but also text data. Data analysts work with both primary data sources, i.e. data from the main system, and secondary data, for example, data from systems which deal with less important, i.e. supporting, processes. Analysts sort, clean and analyse the data using standard statistical tools. They create various types of reports and visualisations for business or management. They design and create relational databases, and define correlations and patterns in complex datasets. The primary skills of a data analyst include database design, familiarity with data warehouses and BI platforms, SQL, data mining, and the ability to visualise the resulting data and present the results, as well as knowledge of statistical techniques, a solid mathematical background and an orientation in the field of finance. In fact, data analysts can join the project at any time. They can be part of the team almost from the very beginning, for example, if the project involves migration of data from the original system to the new one. Or they can get involved after the system is deployed in production to extract and process the first outputs for the client’s business or management, while continuing this work and continuously preparing various reports and visualisations.</p><p>As can be seen from the description above, several analysts are involved in creation of the system design, and their work builds on each other’s. This is one of the reasons why ongoing, more or less intensive communication is important for everyone. Actually, designing new software could be described as a performance given by a symphony orchestra, with the violin accompanied by the flute or the oboe, with the occasional horn or timpani. 
If everybody is in tune, they create a beautiful melody, and if not, everybody has to cover their ears. In the case of software, any “wrong notes” would result in a non-functional solution which would not meet the client’s needs and, moreover, would probably not be usable.<br></p></div>|
|Unexpected bad practices||https://cleverlance.com/en/blog/Pages/unexpected-bad-practices.aspx||Unexpected bad practices||<p>Some programming practices are so familiar to us that we use them automatically without much thought. Sometimes these techniques become obsolete; sometimes they are applied in the wrong context. Questioning such ingrained habits is often met with revolt, especially from those who use them. That is exactly what makes the topic worth addressing, so let's do just that!</p><h3>Marks<br></h3><p>Programming IDEs often recognize specific types of comments to help navigate across the codebase. Xcode’s <em>FIXME</em> lets other developers know that a piece of code deserves more attention. <em>TODO</em> is helpful when something is, well, to be done. <em>MARK</em> differs from the previous cases; it serves a documentation purpose. The same feature in IntelliJ IDEA/Android Studio is called region.<br></p><p>Marks divide the source code into multiple parts using headings. That can make the code appear broken into logical units. If you are a reader familiar with the Objective-C era of iOS development, know that this is just an updated <em>#pragma mark</em> directive.<br></p><p>Typical usage is in files with a large number of lines. <em>Marks</em> create the illusion of clarity by breaking them into pieces that supposedly belong together.<br></p><p>The usage of marks in such cases is a bad practice. Developers often abuse them to justify a file being too big. One should not depend on Xcode to make the code comprehensible and readable. Small and well-decomposed classes are more straightforward to reason about and navigate without IDE features. Especially for pull request reviewers using the web interface where those features are absent.<br></p><h3>Extensions</h3><p>Modern programming languages such as Kotlin or Swift allow you to extend classes, interfaces/protocols, structs, or enums to provide an additional implementation. 
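As a minimal Swift sketch of what such extensions can look like in practice (the <em>Temperature</em> type and its members are invented here purely for illustration):

```swift
// A hypothetical model type, invented purely for illustration.
struct Temperature {
    let celsius: Double
}

// Grouping derived values in their own extension keeps them
// visually separate from the stored properties above.
extension Temperature {
    var fahrenheit: Double { celsius * 9 / 5 + 32 }
}

// Extending a type we don't own (the standard library's Double)
// makes the call site read naturally, but every Double in the
// program now carries this member: a taste of namespace pollution.
extension Double {
    var asCelsius: Temperature { Temperature(celsius: self) }
}

print(25.0.asCelsius.fahrenheit) // prints 77.0
```
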
You can divide your code into multiple pieces using extensions to outline what belongs closer together. Another usage is to make a convenience extension around another type you might not even own to make its use more intuitive. The possibilities are almost limitless. This isn't always a good thing, but first, a peek into history.<br></p><p>Extensions existed way back in Objective-C as well. If you're not blessed with experience of programming in that language and had to guess the name for extensions, you'd likely be surprised. It's Categories! Another surprise is that Extensions are a thing in Objective-C too, but serve a different purpose. What's interesting is the difference between the two languages. Categories in Objective-C forced the developer to come up with a name. That's why files named in the style <em>Class+CategoryName.swift</em> are often used even for Swift extensions. And more importantly, to use Categories, you had to import them explicitly.<br></p><p>Extensions in Swift are an unnamed piece of code. Such code may be more complicated for the reader to grasp. If multiple extensions of the same type exist, adding a name to the code and wrapping it in a type might help readability immensely.<br></p><p>Improper extension of widely used types causes namespace pollution. It's critical, before creating extensions, to ask whether all instances of the type should have such an ability. Should all UIViews have access to a blinking method? Would one specific subclass of UIView make more sense?<br></p><p>Some developers use extensions to break down the implementation of multiple protocols, which might also be a warning sign. 
If a class implements many protocols, it may be time to consider splitting it into smaller classes.<br></p><p>For trolls out there: you can make your co-workers mad by extending <em>UIView</em> with <em>translatesAutoresizingMasksIntoConstraints</em> and watch them compare it with <em>translatesAutoresizingMaskIntoConstraints</em>.<br></p><p>But don't.</p><h3>Comments<br></h3><p>The ability to write comments might lead undisciplined programmers to create code of poor quality. Unfortunately, it's easier to neglect naming a variable properly and instead describe what's going on in one's head with a complicated but not-so-clear comment. Easy should not be our goal. Brevity and clarity should.<br></p><p>A great comment for poorly written code is still a code smell. Don't just take my word for it. Robert Martin states: "A comment is a failure to express yourself in code. If you fail, then write a comment; but try not to fail."<br></p><p>Another reason: as code lives in the repository and is modified and refactored, its behavior might change. A good name expresses that behavior everywhere the code is called, but a comment is rarely updated and may become more confusing than helpful.<br></p><p>Documentation comments serve their purpose very well when you're designing an API for others to use. Remember that the API needs to stand by itself, and clarity is the priority. Don't use documentation comments as an excuse for a lousy design.<br></p><h3>Structure</h3><p>The structure of a project is one of the first things you see when you check out a codebase, and it should outline the app's purpose at first sight. However, it is not uncommon for projects to have folder structures inspired by the layers of the architecture, e.g., View, ViewModel, Model.</p><p>Project structure based on architecture layers is a bad practice. It makes reusability effectively impossible. Navigating through such a structure is unnecessarily complicated, and it becomes harder to maintain as the scope increases. It doesn't scale. 
Folders inspired by the architecture might have their place, just not at the top level. They should not be the first thing you see.<br></p><p><img src="/de/blog/PublishingImages/Articles/MobileIt/unexpected-bad-practices-01-01.png" data-themekey="#" alt="" style="margin:5px;" /><br></p><p>See for yourself: which structure tells you more about the application?<br></p><h3>Dependencies</h3><p>Open source offers many libraries to simplify life, from UI components through networking to dependency injection solutions. It can often save a great deal of time and effort. On the other hand, this carries various dangers and limitations; using third-party libraries requires discipline, order, and balance.<br></p><p>Randomly scattered third-party dependencies significantly reduce robustness. Shielding the core of the application and using the libraries from the outer parts of the architecture helps mitigate the risk. Abstraction eases the process of replacing one library with another.<br></p><p>It's OK to utilize third-party dependencies, but with caution. Ask yourself: How much time will it save me? How much effort will it take to replace? Can I install enough defense mechanisms to protect the application?<br></p><p>The silver bullet to protect your app, though sometimes tricky or impractical, is to have the import of the dependency in only one place.</p><p>We've had the pleasure of taking over multiple apps that had become impossible to maintain due to this problem. Without abstraction, no-longer-supported (or closed-source) libraries disintegrated the codebase. External dependencies should never hold your product hostage.<br></p><h3>Tests</h3><p>Test-driven development is a programmer's good manners, a discipline overflowing with benefits. The technical impacts are a blog post in themselves, if not a series. 
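The test-first rhythm behind that discipline can be sketched in a few lines of Swift; the trivial <em>add</em> function and the plain <em>assert</em> below are invented here to stand in for real production code and a real test framework:

```swift
// Red: the assertion is written first. With no `add` implemented
// yet, this line doesn't even compile, which is the first
// "failure" of the red-green cycle.
// assert(add(2, 3) == 5)

// Green: write just enough production code to satisfy the test.
func add(_ a: Int, _ b: Int) -> Int {
    return a + b
}

// The same assertion now passes, and it stays in place as
// executable documentation of the behavior.
assert(add(2, 3) == 5)
```

In a real project the assertion would live in a test target and run on every build; the point is the ordering, test before implementation.
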
Non-technical impacts such as easy onboarding of new team members and executable documentation that cannot become obsolete speak for themselves.<br></p><p>Yet tests are often neglected. A complete absence of tests is the first and most common violation, followed by writing tests after the production code, which forfeits most of the benefits and introduces other obstacles.<br></p><p>You must write unit tests first - before production code. Testing first will prevent you from creating code that's too complex. It will guide you through building components of the right size. Big classes are challenging to test, and the tests will direct you to decompose them into smaller ones.<br></p><p>Tests written after production code are inherently lower quality and can even be misleading. Unless you write the production code as proof of the first failing test, it is impossible to say whether the tests assert what they declare. It is then questionable how well such tests protect the system under test.<br></p><p>If you write tests after implementation, you may discover that a component is hard to test, something that cannot happen with a test-first approach. You can't create untestable code!<br></p><h3>The devil's in the detail</h3><p>Even the mundane can be harmful if we do something too automatically and with too little attention. Challenge the ordinary and seek bad practices that you wouldn't expect.<br></p>|
|May dedicated to design||https://cleverlance.com/en/blog/Pages/graphics-for-children-II.aspx||May dedicated to design||<p>We dedicated all of the Wednesdays in May, and the first one in June, to graphics and design. The QUB creative department prepared a five-part course in graphic design for children. The lessons were set up as five one-hour online sessions and we were thrilled with what the 11 little designers aged 7-12 managed to accomplish during the course. Incidentally, you may have <a href="/en/blog/Pages/graphics-for-children.aspx" target="_blank">read the article by twelve-year-old Viky</a>.</p><p>We started with the fundamentals - we explained the basics of colour theory, went through some interesting facts about the creation of pigments and finished the first lesson with the creation of a colour palette, which is essential for the start of any design project.</p><p>During the second meeting we touched lightly on history. From pictograms, hieroglyphs and cave paintings we made it all the way through to the division of typography into expressive and functional, and the children learned, among other things, to distinguish between serif and sans serif fonts. Be prepared for your little designers at home being able to amaze you with interesting facts about font construction and correctly state that some characters have bellies, tails and eyes and that a pin needn’t necessarily be the one you use in bowling! We finished the lesson by practising the correct way to adjust spacing between letters.</p><p>At the third meeting, we discovered together that the style of comic book heroes with black outlines originated in traditional Japanese woodcut and Art Nouveau posters. We shared a few tips on how to add dynamics to a story and work with bubbles, and made it clear that a comic should have a hero, a plot, a setting, and a consistent graphic style. 
Together we created a short comic strip on the theme of Surprise.</p><p>In the fourth lesson, we delved into packaging design, practised the skills we had acquired relating to colours and typography and put the little designers up against a difficult task - designing the packaging for a bag of sweets. Again, we followed the same procedure as we would in normal practice, starting with research and noting details and differences between, for example, fruit and chocolate sweet designs. The children showed a tremendous amount of creativity and, apart from designing the packaging, they also came up with ideas for the names of the new sweets - would you buy Chicken Beaks if you saw them on the shelves?</p><p>In the first four lessons, we deliberately avoided the computer and worked with designs on paper (again, the same as is the case in practice when creating designs). In the fifth and final lesson we finally made our way to a graphics editor and introduced the children to <a href="https://www.figma.com/">Figma</a>. We tried to transfer the design for our bag of sweets onto the computer. The children passed this last test, almost a trial by fire, and we are currently collecting all of their creations. </p><p><em>On behalf of <a href="https://qub.digital/en/our-work" target="_blank">QUB Digital</a>, Ivana Stránská, Michal Hořava and Jan Čermák</em></p>|
|Graphics for children through the eyes of Viky||https://cleverlance.com/en/blog/Pages/graphics-for-children.aspx||Graphics for children through the eyes of Viky||<p>Twelve-year-old Viky wrote a great, authentic report from a Cleverlance graphic design course for children.</p><h3>1st lesson</h3><p>At the very beginning of the 1st lesson we introduced ourselves to the others as we do in other courses, but we also said what we wanted to learn. Once we had introduced ourselves, the lesson could begin. First they told us the colour of the year (which is called Very Peri) and how important it is for the designer. Actually, the designer uses the colour of the year almost everywhere. We also talked about the colour wheel, where you can see the contrast of colours beautifully. Then we learnt about the history of colours. It is very interesting that they were already using white paint in prehistoric times, because white is difficult to get and you even need the help of some chemicals to get it. And the Romans, for example, liked different shades of brown, so it was a romantic sort of style. Another topic was pigment. Depending on the binder you put in the pigment, different colours are produced. In the past, honey, oil or egg was used as a binder. For example, if you put honey in the pigment as a binder, you get watercolours or the same thing but with an egg, you get poster paints. The last topic was which different stones are used to make different colours. For example, yellow is made from volcanic stone or interestingly, white is made from black stone, although there is some chemical treatment, but that’s beside the point. At the end of the lesson, we were given “homework” to come up with our own colour palette for the next week. 
I enjoyed it very much and look forward to more graphic design lessons.<br></p><p><img src="/de/blog/PublishingImages/Articles/CreateIt/viki1.jpg" data-themekey="#" alt="" style="margin:5px;" /><br></p><h3>2nd lesson</h3><p>In the 2nd lesson we talked about typography. First we discussed the history of writing. The very first writing was hieroglyphics, which were invented in Egypt. This kind of writing was time-consuming. Imagine if you had to draw a duck just to write a single word. Another type of lettering was invented by the Phoenicians and it was the first syllabic script, and from it came the Roman alphabet which we still write with today. One of the second to last topics was explaining what serif and sans serif fonts are and I am writing with sans serif at the moment. Also what uppercase and lowercase letters are. And the second to last thing we did was that they explained poster fonts and the ones they use in newspapers and so on. Poster fonts are meant to catch the eye and make an impression, but sometimes they are almost illegible. On the other hand, journalistic fonts must be easy to read. The last thing we did was that they sent us a link to a website chat room. We were able to practice placement of letters in headings and so on there. Like last time, we were given “homework” but this time we had to draw or paint our name (see picture in the header of the article). Again, like last time, I really enjoyed it and I’m looking forward to the next one.</p><h3>3rd lesson</h3><p>In the 3rd lesson we dealt with comics. The very first topic we discussed was those sort of panels and we said that these panels can be arranged in different ways to keep the reader interested. Then there is that sort of shading in black and white comics, where for example the 1st panel is grey, the 2nd and 3rd panel is white and so on. Sometime around the middle of the lesson, we tried to draw our own comics, but just a strip. That means a comic with two or three panels. 
As soon as we had finished our “comic” the lesson was over. I enjoyed it a lot and look forward to the next one.<br></p><p><img src="/de/blog/PublishingImages/Articles/CreateIt/viki2.jpg" data-themekey="#" alt="" style="margin:5px;" /><br></p><h3>4th lesson</h3><p>In the 4th lesson we painted and invented our own bag of sweets. First we had to come up with a name, I came up with “Japonky”. Then the font style and when the pencil sketch was ready, we came up with a colour palette. Once we had coloured the name, we did a design like an image of the flavour: watermelon, marshmallows, etc. with leaves around it or something else. We also had to put how much it weighs and what flavour it is (just in case). While we were doing this, Michal and Ivana told us about contrast and the golden ratio. We showed them our bags of sweets at the end of the lesson. They complimented us and told us to make an interesting background and take a picture of it for them. Like each and every graphic design class, I enjoyed it, plus we got to practice typography and choosing colours which go together in this one.<br></p><h3>5th lesson</h3><p>In the 5th lesson we worked with Figma. We redid the bag of sweets we did last time on a computer. First we set the paper format and set the colour. Then we did the headline and edited it. We also put in different shapes there and we found different vector images on a website. We were supposed to have it finished and sent by next week. The person who had the nicest one gets a bag of sweets. All of the graphic design lessons were great and I probably enjoyed the 3rd lesson the most. If there were more, I would definitely join in.<br></p><p>Stay tuned for graphic designs created by the other people who took part in the course in the near future - because who wouldn’t want to see the best-looking bag of sweets!<br></p>|