Archive for the ‘Uncategorized’ Category
Familiarity with Excel’s copy/paste capabilities will significantly improve your ability to get more done in less time. Copy/paste seems simple enough, but the feature offers far more than the simple name suggests. In this article, guest blogger Nick Williams explores copy/paste in depth. Nick is an Access tutor based in the U.K.
Copying and pasting is always a useful way of moving data from one location to another; in Excel, however, it can be so much more. The copy as picture and paste special options add some really useful functions, yet many people are not aware of their capabilities.
Copy as picture
The copy as picture feature does exactly what it says, allowing part of the worksheet to be copied and pasted elsewhere as a picture. This is particularly useful for taking snapshots when working with data that updates frequently. It is also useful when pasting data into another program, for example Word or PowerPoint: when copied as a picture, your table or chart will look exactly as it does in Excel when pasted, saving the need to tidy it up. The copy as picture feature can be accessed through the drop-down menu next to the copy icon on the Home ribbon.
Paste special
The standard paste tool will copy both the design and contents of a cell or selection of cells. Whilst more often than not this is what most people want to do, there are times when the additional features of paste special are handy. It can be accessed through the drop-down menu below the paste icon on the Home ribbon.
The options available upon clicking on paste special will vary depending on what’s currently stored on the clipboard. An image will not offer any choices beyond what type of image to paste it as, whereas a cell copied from elsewhere within Excel will offer plenty of options and is where paste special is most useful.
When pasting a cell or group of cells from within Excel, there are two main option groups; paste and operation. The various paste options allow the user to pick what features from the cell they are copying to paste. The operation option allows basic mathematical operations to be applied to cells.
The below example aims to demonstrate how some of the paste functions work. Column A simply shows the numbers 1 to 5 with one decimal place, with cell A7 summing these 5 numbers using a simple formula. Some basic background-colour and cell-border formatting has been applied, and there is conditional formatting in place giving cells with a value of less than 3 a bold, red font. There is also a comment on cell A4.
All – Copies everything in the cell; the content, visual formatting, number formatting and comments. The pasted cells in Column B of the example appear exactly as they do in Column A, other than the width of the column itself.
Formulas and Values – In the example above, the Formulas and Values columns look identical to one another. The numbers from Column A have carried over into Column C and Column D; however, the visual and number formatting have not, nor has the comment. The difference between the two is that formulas are copied when Formulas is selected, but not when Values is selected: the formula bar for cell C7 would show =SUM(C2:C6), whereas the formula bar for cell D7 would simply show the number 10. Replacing formulas with values can be a good way of getting a snapshot of data in a spreadsheet that is updated frequently.
Formats – Selecting formats will see the visual formats carried over (including conditional formatting), but neither the data nor comments are copied with it. Number formatting is not copied, so if the number 1 was to be entered into one of the cells in Column E in the above example, it would show as 1 rather than 1.0.
Comments – This will paste only the comments, and not any of the formatting or data contained in the copied cell, a great time saver if the same comment needs to be used repeatedly throughout a spreadsheet.
Validation – It is difficult to show validation in the example as it works in the background. It does what it says, and pastes only the validation rules from the copied cell into the pasted cell. If the cell being pasted into already contains a value that does not meet the validation rules, it will not be flagged up unless the cell contents are edited.
All using Source theme – This is difficult to show in the example as it only works if pasting into a different spreadsheet. As well as formatting cells individually, it is possible to set a theme for the whole spreadsheet which will control things like font and colour. When copy and pasting data from one spreadsheet to another, formatting set by the theme will not be copied unless this option has been selected.
Column widths – Pasting column widths allows the width of one column to be applied to another without affecting any of the data or formatting in it. Pasting column widths so that they are all equal can make a spreadsheet look much neater. In the example, Column H, where column widths have been pasted, is the only column with the same width as Column A.
Formulas and number formats – As with pasting a formula, this option will paste the formula from one cell to another. The difference here is that number format, for example currency or number of decimal places, will also be copied.
Values and number formats – Similarly, pasting values and number formats will keep the number formatting, but will convert any formulas to values.
All merging conditional formats – Although conditional formatting is pasted along with any other formatting, it will override any conditional formatting already contained in the cells. This option allows the conditional formatting in both the source and destination cells to be applied. In the above example, Column K had a conditional format applied telling it to make all cells with a value of less than 3 brown. Rather than overwrite this, both this conditional format, and the one in the copied cells turning the font bold and red have been applied.
The example below shows how the operations work within paste special. They will add, subtract, multiply or divide the value in the pasted cell by the value in the copied cell. In the example, Columns D to G originally contained the numbers 10, 20, 30, 40 and 50, as shown in Column C. When the number 10 is copied from Cell A2 and pasted into Columns D to G using a mathematical operation, it is added to, subtracted from, multiplied by, or divided into the original numbers.
Skip blanks – This is something that may be used when copying and pasting multiple cells. If one of the copied cells is blank, Excel will not overwrite the data in the corresponding pasted cell with a blank; rather, it will leave the contents as they were originally, while overwriting any cells where the copied cells do contain data. Cell C4 in the below example shows how, when a blank cell is copied and Skip blanks is selected from the paste special menu, the original contents of the cell remain.
Transpose – Transposing data allows rows to be turned into columns and vice versa.
Paste link – Selecting paste link will create a formula linking to the copied cell, rather than pasting the actual value or formula it contains. This is useful if the data is likely to change but has to appear in multiple places throughout a workbook.
Welcome back guest blogger, Nick Williams. In this article he discusses the intricacies of User Defined Functions. UDFs are arguably the single most important feature of Excel because they add virtually infinite depth to Excel’s functionality, supplementing the already extensive list of ~350 built-in functions. The only limit on what you can do with UDFs is the depth of your imagination. Nick is an Access tutor based in the U.K.
Creating User Defined Functions in Excel
Most users are familiar with at least some of the functions built into Excel, for example SUM and AVERAGE. Whilst these are useful tools, there are times when they are not capable of performing some of the more specialised tasks that advanced Excel users require. Creating a custom function is a way of performing a calculation over and over again without the need to run a macro each time or create a series of complicated formulas.
Enabling the Developer tab
User defined functions are created in the Visual Basic Editor, which can be accessed from the Developer ribbon. If the developer ribbon is not visible, it can be enabled by going to the File menu and choosing Options. Select Customize Ribbon from the list on the left hand side, and ensure that there is a tick next to Developer on the list of tabs on the right.
Creating a new module
The Visual Basic Editor can be opened by pressing Alt+F11, or by clicking the Visual Basic Editor icon on the Developer ribbon. Once in the Visual Basic Editor, the first step towards creating a custom function is to create a new module. Although it is possible to add a function to an existing module, putting it in its own module makes it easy to export and reuse in another workbook.
To create the new module, from the menu click Insert, followed by Module. The module should be inserted into the current workbook by default. It is best to store the module within the workbook itself, rather than in the Personal workbook, as anything stored in the Personal workbook is only available to the current user on the current computer. If the file were to be opened on another computer, or even by another user on the same computer, errors would occur wherever the function has been used.
Writing the function
The code for the new function can be entered into the new module window. The actual content will vary depending on what the function is doing, however, it will always start with ‘Function’ at the top and ‘End Function’ at the bottom. The variables used by the function go in the brackets after the function name. The below example is for a very simple function that works out speed.
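A minimal version of such a function, matching the description that the article gives of it, might read:

```vba
Function SPEED(Distance As Double, Time As Double) As Double
    ' Speed is simply distance divided by time
    SPEED = Distance / Time
End Function
```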
The first line declares the name of the function (SPEED) and its two parameters (Distance and Time). The ‘As Double’ after each of the variables tells Excel what type of data to expect from these parameters, in this case a double-precision floating-point number. The second line is the code telling the function how to calculate the value it needs to return; in this basic example it is simply one line stating that speed is equal to distance divided by time. The final line, End Function, tells Excel that this is the end of the function and there is no more code to run.
In practice, most functions will be much longer and more complex than this, as speed is something that could easily be calculated using a simple formula. Although there is nothing wrong with making calculations in this way, especially if they are being performed over and over again within the document, user defined functions really come into their own when doing more complex calculations, for example where multiple if statements are required.
Using the function
Once the function has been created in the Visual Basic Editor, it can be called up in a workbook in the same way as any of the built-in functions. The below shows how the speed of three people has been calculated using the newly created SPEED function. The formula refers to other cells to get the variables distance (Cell B2) and time (Cell C2) and uses these to calculate the speed.
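For the first person, in row 2, the worksheet formula would simply be:

```
=SPEED(B2, C2)
```

with the formula then copied down for the remaining rows.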
Unlike built-in functions, custom functions do not show what variables are required once the open bracket is typed. The variable names can, however, be displayed by pressing Ctrl+Shift+A; rather than appearing as a tip, they are entered into the formula bar as a prompt for what they need to be replaced with.
Saving the workbook
As the user defined function is created using VBA code, it will be necessary to save the workbook as a macro-enabled workbook with a .xlsm file extension; when saving, simply select this as the file type. If this step is not taken, or the workbook has previously been saved as a standard Excel workbook with a .xlsx extension, a prompt will warn that the function will not be saved.
Importing and exporting functions
If a function needs to be used in more than one workbook, rather than rewriting it, it can be exported from the Visual Basic Editor and then imported into another workbook. In the Visual Basic Editor, click on the module to be exported and select File > Export file… from the menu. In the new workbook, open the Visual Basic Editor and select File > Import file… to import the function.
Trying to learn keyboard shortcuts can be intimidating; there are just so many. Like most problems, however, you can overcome this one by breaking it down into bite-sized chunks. So if you haven’t been using keyboard shortcuts, start by learning and using just these five. Chances are, once you have used them only a few times, they will become second nature. Remember that some keyboard shortcuts require a combination of keys. That means you press and hold the first (and second, and third, in some cases) and tap the last. As you will see, frequently the key to use makes logical sense, but sometimes it doesn’t.
Ctrl+A: Select the entire document
Ctrl+X: Cut the selected content or object(s)
Ctrl+C: Copy the selected content or object(s)
Ctrl+V: Paste what you have cut or copied
Ctrl+Z: Undo the last action you did
These particular keyboard shortcuts (and many more, actually) have a consistent meaning throughout the Windows world. Whether you are working with a Word Document, an Excel Workbook, a PowerPoint Presentation, any other Office document, or Windows (File) Explorer, you use the same keyboard shortcuts.
Why bother learning these or other keyboard shortcuts? It’s all about time. To be sure, the time you save by using a keyboard shortcut once is insignificantly brief. But these shortcuts do things that you do frequently and repeatedly whenever you are working on your computer. Think of those tiny slivers of time as grains of sand on a beach. Save enough of them and you will have time for a beach vacation.
Links for the technically inclined:
Announcing the Biggest VM Sizes Available in the Cloud: New Azure GS-VM Series
Important announcement in Azure space
Announcing VP9 support coming to Microsoft Edge
New feature for Microsoft Edge
Unity 5.2 and Visual Studio Tools for Unity 2.1
Visual Studio is the new default Unity scripting editor on Windows
Announcing the Microsoft Azure Tour
The Microsoft Azure Tour is a free one day technical training event for developers and IT professionals to help you achieve more success with Azure.
Visual Studio Code and Visual Studio Online
Blog on the integration with Git
I needed to find a particular Word document that I wrote several years and several computers ago. Fortunately the file was one that I had migrated from computer to computer when I upgraded hardware. Yesterday morning, I was under a time-crunch to find the file. I needed the file and I needed it now, before I headed off to my client’s site. The file was a set of supplemental exercises I had written for the course I would be delivering.
I was pretty sure the file name included the word ‘supplementary’ and I knew that the word was included in a heading in the file itself. So I used ‘supplementary’ as my search term and, within a few seconds, search produced a list of results that matched the search term. With a click on the link, I had my file, and a smile on my face.
I’d like to claim that I invented Windows 7 like some of the characters in current Microsoft commercials do but I didn’t. Somehow, however, when I use Windows 7 features, it feels like the real Windows 7 designers had read my mind.
The database development process is straightforward but can only be efficiently carried out if the developer follows a set sequence of steps. The order of the steps is important because each step depends on the step before it. This series of articles has illustrated this sequence. Part 1 described the results of the data analysis step, where we need to gain a full understanding of the data available to meet the client’s requirement. Of course, the design process illustrated here itself depends on a complete and accurate understanding of the client’s needs.
Even the best design in the world sometimes needs some revisions and tweaking when we start to develop queries and reports. Feedback from the Output phase of the process sometimes sends us back to revise and refine some aspect of the design.
The query at the end of part 2 is a good example of this. As is, that query can only provide the id of a representative, not the representative’s actual name. So we need to add another table, tblHumanResources, to the query. This change enables the query to provide representative names for the report.
Rather than including the three representative name fields in the query grid, the first column of the query is defined by a calculation: Representative: [hrlname] & ", " & [HrFName] & " " & [hrmname].
All that remains now is to design the actual report.
Building the detail section of the report is simply a matter of dragging and dropping fields from the field list into the correct positions. The naming convention described in part 2 helps ensure that each field is situated in the correct product column and weekday Sales or Quotation row, even though the field names obscure the actual product names.
Because there are multiple controls for each product in two different categories, individual addition expressions must be created in order to display weekly totals for both Sales and Quotations. For example, the expression for the weekly sales total for Widgets is:
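Based on the SAPr…/QUPr… field-naming convention used for the query aliases (SAPr1_MO for Monday’s Widget sales, and so on), the control source would be along these lines (the exact names are an assumption):

```
=[SAPr1_MO]+[SAPr1_TU]+[SAPr1_WE]+[SAPr1_TH]+[SAPr1_FR]
```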
Here again, the naming convention makes it easy to create the expressions for each product without a lot of typing. After creating the first control (for product 1 – widgets) and typing the expression as its data source, simply make five copies. Edit the data source for each product total in turn and replace the ‘1’ with ‘2’, ‘3’, ‘4’, ‘5’, or ‘6’, depending on the product. This technique will also work for the Quotations totals.
After adding all the controls, adding appropriate labels then formatting and aligning them, and setting up report grouping on Representative, this is what the final report looks like:
You should keep in mind that, because queries are limited to a maximum of 255 fields, this approach will only work if 255 or fewer ‘cells’ are required in the report details. This report has 60 ‘cells’ (6 products x 5 days x 2 categories), so it is well within the limit.
In this article I will focus on the data structure behind the interleaved report. The data illustrated is only a small subset of what you would find in a typical full-scale application. In order to produce the report described in part 1, we need to know:
- who made the sale or quotation
- what product was involved
- was the action a sale or a quotation?
- when did it happen
These descriptions suggest the need for certain tables:
- human resources (sales representatives)
- product lines
- sales activity
- sales activity types
Here’s how the tables are organized in the database:
Obviously this is a minimal structure designed to illustrate what’s needed for the reports. The Human Resource table has only three name fields whereas in a real application much more information about people would be required.
The data is normalized to the third normal form. The problem here is to create a report from a flattened version of the data. The client’s requirements in part 1 specified that the report require one column for each product group, with sales on one row followed by a similar row for product group quotations. Keeping in mind also that the desired report should have some vertical space between each pair of rows, and that there should be one pair of rows for each day of the week, it should be apparent that a simple datasheet or even a continuous report will not meet these requirements.
Before designing the actual report it is first necessary to flatten the data. Crosstab queries are specifically designed to do exactly that. The data we will need in the query includes the sales representative, the date, the type of action, and the number of times the action happened on each day. So the basic crosstab design will look like this:
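In SQL view, the basic crosstab might look something like this (the ActivityID and Product field names are assumptions; Rep and ActDate are taken from the queries in this series):

```sql
TRANSFORM Count(tblActivity.ActivityID) AS ActionCount
SELECT tblActivity.Rep
FROM tblActivity
WHERE tblActivity.ActDate Between #7/26/2010# And #7/30/2010#
GROUP BY tblActivity.Rep
PIVOT tblActivity.Product;
```

Fixed column headings, when needed, can be set by adding an In ("…") list to the PIVOT clause or via the query’s Column Headings property.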
Notice that the query has included a field to restrict the results to a particular time frame (July 26 – 30, 2010). In this example the criterion has been hard-coded. In a working database the query would most likely refer to a user-specified time frame from a form or memory value.
What’s missing from this query, however, is any sort of specification of the activity type (sale or quotation) and the relevant day of the week. To further refine the specification, it is necessary to add two additional Where fields to the query.
Actually, in order to report on each product for each day of the week, a total of ten queries will be needed (5 days x 2 action types). Before creating these queries, however, it will speed things up greatly if you specify column headings in the query’s properties. (Remember that crosstab data has to have pre-defined column headings if the data is to be used in a report.)
This basic query is only needed as a model for the queries that will ultimately be used in the report. By setting the query up with as much generic detail as possible, you can quickly create the ten queries simply by modifying the action type and day of week criteria and using Save As repeatedly for each set of criteria. Subsequent work with the queries will be much easier if you use a consistent naming style for these day/action queries.
The names used in the example database are:
Sales queries (Monday to Friday): qxtbActivity_SA_MO, qxtbActivity_SA_TU, qxtbActivity_SA_WE, qxtbActivity_SA_TH, qxtbActivity_SA_FR
Quotation queries (Monday to Friday): qxtbActivity_QU_MO, qxtbActivity_QU_TU, qxtbActivity_QU_WE, qxtbActivity_QU_TH, qxtbActivity_QU_FR
Notice how these names follow a consistent pattern. The names all begin with qxtbActivity, indicating that the query is a crosstab of the Activity table. The two characters after the first underscore indicate whether the query is retrieving sales or quotation data. The final two characters indicate the weekday the data represents. Using consistent naming patterns like this simplifies both the creation and identification of the queries. For example, you can use a text editor to search and replace the distinguishing characters to create the other queries in the set.
These queries will return the sales and quotation data for each sales rep for each day on which that rep had activity. However, in the unlikely event that a rep had no sales or no quotation activity, that rep will be missing from that day’s sales or quotation data, or both. What we really need is for the rep to be included in the query output with zeros in each product column.
What we need is a list of sales reps who were active in the reporting period. A simple query of representatives in the activity table, grouped by rep, with a criterion limiting the date range will do the trick. The SQL of the query is:
SELECT tblActivity.Rep
FROM tblActivity
WHERE (((tblActivity.ActDate) Between #7/26/2010# And #7/30/2010#))
GROUP BY tblActivity.Rep;
Now this query can be joined to each of the crosstabs to produce a set of ten selection queries. These queries all follow a similar pattern and use an outer join to ensure that all representatives will be included whether or not they have sales.
In these queries, the product names have been “aliased” so that they can be distinguished as sales or quotations on a day-by-day basis. Again, a consistent naming pattern helps speed up the creation process. For example, QUPr1_MO is the alias for Monday’s Widget quotations and SAPr1_MO is the alias for Monday’s Widget sales; Pr1 simply indicates that this is the first product field, which holds the widget data. The queries themselves are named “qsel” followed by the part of the crosstab name after “qxtb”.
For example, the query in the diagram above is named qselActivity_QU_MO, indicating that it will provide data for the Monday column of the report.
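As a sketch, that query might read as follows, assuming the rep-list query is saved as qselActiveReps and the crosstab exposes product columns Pr1 and Pr2 (both names, and the use of Nz to supply the zeros, are assumptions):

```sql
SELECT qselActiveReps.Rep,
       Nz(qxtbActivity_QU_MO.Pr1, 0) AS QUPr1_MO,
       Nz(qxtbActivity_QU_MO.Pr2, 0) AS QUPr2_MO
FROM qselActiveReps
LEFT JOIN qxtbActivity_QU_MO
       ON qselActiveReps.Rep = qxtbActivity_QU_MO.Rep;
```

The LEFT JOIN keeps every active rep in the output, and Nz converts the missing crosstab values to zeros.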
We need one more query to give us the data that the report will display. Think of this as an umbrella query that will combine the results of the ten queries just created into a single query:
The order of the sub-queries doesn’t matter much, but it will be easier to work with the query when designing the report if the field lists are arranged left to right, Monday to Friday. Each query should be joined to the one on its right by the Rep field. In this diagram the field lists are arranged in two rows simply to make them easier to display in this article; in the actual query, qselActivity_QU_MO is to the right of qselActivity_SA_FR.
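Using the naming convention above, the umbrella query’s SQL might begin like this (only the first of the nine joins and a few of the 61 columns are shown; the column names are assumptions):

```sql
SELECT qselActivity_SA_MO.Rep,
       qselActivity_SA_MO.SAPr1_MO,
       qselActivity_SA_TU.SAPr1_TU
FROM qselActivity_SA_MO
INNER JOIN qselActivity_SA_TU
        ON qselActivity_SA_MO.Rep = qselActivity_SA_TU.Rep;
```

The remaining eight queries are joined in the same way, each on the Rep field, with all of their SAPr… and QUPr… columns included in the SELECT list.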
The final query has 61 columns – rep plus one column for each of the SAPr… and QUPr… fields.
In the final part of this series we will see how to display this query data in a grid format grouped by representative.
Since I “discovered” OneNote in the Office 2007 suite, it has become an essential tool for me on a daily basis. Microsoft introduced the application with Office 2003 but I completely missed it there.
All I can say is, if you haven’t used OneNote yet, what’s keeping you?
Yesterday, July 1, I received the notification that I have been recognized as a Microsoft Most Valuable Professional for the third year in a row. This is indeed a great honour, considering that there are fewer than 5,000 MVPs world-wide. One of the benefits of being an MVP is that we have a designated person (our MVP lead) at Microsoft with whom we can correspond on a personal basis. You can read more about the MVP program on the MVP website.