Channel: Microsoft Dynamics 365 Community

Electronic Reporting: Import of GL Excel Journals (Part 3)

This third and last post on GL journal imports with the help of Electronic Reporting focuses on the import of so-called split postings where one debit account and multiple credit accounts (or vice versa...(read more)

Desktop Icons Suddenly Small


I'm posting this one because it is quite silly and I'm probably not going to remember it if I don't write it down.

I logged onto my main desktop the other day and all of the desktop icons were a lot smaller than usual:

Desktop with small icons

I had no idea how this had happened. I have syncing set up with OneDrive and had recently installed and configured an old laptop with a 1366×768 display, while my main desktop is 2560×1440, so my initial thought was that this mismatch was to blame. However, I couldn't find anything which looked like the cause of the problem.

I resorted to an online search, but didn't immediately find anything. I did stumble onto the resolution, which was probably also what caused my problem in the first place.

On one web page, I zoomed in using Ctrl+mouse scroll wheel, but my mouse wasn't over the browser window; it was over my desktop. The icons on my desktop became even smaller; reversing the scroll direction made them bigger again:

Desktop with normal icons

I've no idea if you've always been able to zoom the desktop or if it is new, but now that I know, I can be more careful when zooming in on a website and make sure the mouse cursor is over the browser.

Read original post Desktop Icons Suddenly Small at azurecurve|Ramblings of a Dynamics GP Consultant; post written by Ian Grieve (Lead ERP Consultant at ISC Software Solutions)

Integration With Dynamics 365 CE And Companies House API – Azure Logic App


Introduction and Background

Before I start the overview of this process, I think it's worth noting that over the last few years the options for integration, and to be fair any sort of extension and development in Dynamics 365, have been shifting from being a C#-only domain, or at least a coding exercise, to an extensive toolset of alternatives – Azure and Office 365, to mention a few.

So for this example – integration of data from an external data source – I will be going through the approach I consider the easiest and lowest overhead in terms of coding. Yes, there are a number of different options for achieving this, and writing code is one of them, but my favourite expression at the moment is "Don't crack a walnut with a sledgehammer".

So a couple of assumptions in terms of your setup. You’ll need:

  • An Azure Account
  • Dynamics 365 Instance
  • A basic understanding of APIs and JSON

The use case for this integration is to call out to the Companies House API and pull back some company data into Dynamics 365. There is a lot of information stored there about limited companies in the UK, and it could be really useful.

The Companies House API is still in beta (to be fair, it has been in beta for a couple of years), but it seems fairly stable, and there is a wealth of documentation and a forum. You will need to create an account on the site in order to access and use the APIs.

You can do this, and access the documentation and Forum here: https://developer.companieshouse.gov.uk/api/docs/

The process is going to be something like this:

So one of the first things you'll need to do once you have created an account on the Companies House website is register an application. This process then assigns you an API key used for authentication. There are a few options for the type of API, restriction of IP addresses, etc., but for this example we'll go for a REST API. Once this is completed, you'll get an API key.
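Before wiring anything into the Logic App, it can help to verify the key on its own. Here is a minimal sketch in Python, assuming the base URL and company-profile endpoint from the Companies House documentation linked above; the company number is just an illustrative placeholder, and the key is sent as the username in HTTP Basic authentication with a blank password:

import requests

API_KEY = "your-companies-house-api-key"   # the key assigned to your registered application
COMPANY_NUMBER = "00000006"                # placeholder company number for testing

# Companies House expects the API key as the Basic auth username with an empty password
response = requests.get(
    f"https://api.companieshouse.gov.uk/company/{COMPANY_NUMBER}",
    auth=(API_KEY, ""),
)
response.raise_for_status()
print(response.json()["company_name"])

If this returns the company name, the key is working and the same URI pattern can be used later in the Logic App's HTTP action.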

The next step is to set up your Azure Logic App. I'll skip the bits around setting up resource groups etc. Click on Create a Resource, Integration, then Logic App:

Add the details:

Once this is all done, you can start to build out your logic. When your resource is ready you can navigate to it. If you’re familiar with Microsoft Flow, you might recognise some similarities.

There's a reason why Logic Apps and Flow look similar: Flow is built on top of Logic Apps. You can read more details about the similarities and differences here.

Click on the Blank Logic App and add the trigger for Dynamics 365. In this example I am going to trigger the integration when a new Account is created, but it could also be when a record is updated and so on.

So the next stage is to use the HTTP connector and set it up to point to the Companies House API. For authentication use Raw, and you'll only need the value (no need for a username and password).

You'll also notice that the URI is appended with a Dynamics field, in this case the Account Number field, which needs to store the company number, as this is the format for looking up the company. Obviously that means you need to know the company number!

Once the HTTP action is set, you need to parse the resulting JSON. For this you need the Parse JSON operation, where you will set the schema for the returned JSON.

For the GET Company Schema use the following:

{
    "properties": {
        "accounts": {
            "properties": {
                "accounting_reference_date": {
                    "properties": {
                        "day": {
                            "type": "string"
                        },
                        "month": {
                            "type": "string"
                        }
                    },
                    "type": "object"
                },
                "last_accounts": {
                    "properties": {
                        "made_up_to": {
                            "type": "string"
                        },
                        "period_end_on": {
                            "type": "string"
                        },
                        "period_start_on": {
                            "type": "string"
                        },
                        "type": {
                            "type": "string"
                        }
                    },
                    "type": "object"
                },
                "next_accounts": {
                    "properties": {
                        "due_on": {
                            "type": "string"
                        },
                        "overdue": {
                            "type": "boolean"
                        },
                        "period_end_on": {
                            "type": "string"
                        },
                        "period_start_on": {
                            "type": "string"
                        }
                    },
                    "type": "object"
                },
                "next_due": {
                    "type": "string"
                },
                "next_made_up_to": {
                    "type": "string"
                },
                "overdue": {
                    "type": "boolean"
                }
            },
            "type": "object"
        },
        "can_file": {
            "type": "boolean"
        },
        "company_name": {
            "type": "string"
        },
        "company_number": {
            "type": "string"
        },
        "company_status": {
            "type": "string"
        },
        "confirmation_statement": {
            "properties": {
                "last_made_up_to": {
                    "type": "string"
                },
                "next_due": {
                    "type": "string"
                },
                "next_made_up_to": {
                    "type": "string"
                },
                "overdue": {
                    "type": "boolean"
                }
            },
            "type": "object"
        },
        "date_of_creation": {
            "type": "string"
        },
        "etag": {
            "type": "string"
        },
        "has_been_liquidated": {
            "type": "boolean"
        },
        "has_charges": {
            "type": "boolean"
        },
        "has_insolvency_history": {
            "type": "boolean"
        },
        "jurisdiction": {
            "type": "string"
        },
        "last_full_members_list_date": {
            "type": "string"
        },
        "links": {
            "properties": {
                "filing_history": {
                    "type": "string"
                },
                "officers": {
                    "type": "string"
                },
                "persons_with_significant_control": {
                    "type": "string"
                },
                "self": {
                    "type": "string"
                }
            },
            "type": "object"
        },
        "registered_office_address": {
            "properties": {
                "address_line_1": {
                    "type": "string"
                },
                "address_line_2": {
                    "type": "string"
                },
                "locality": {
                    "type": "string"
                },
                "postal_code": {
                    "type": "string"
                }
            },
            "type": "object"
        },
        "registered_office_is_in_dispute": {
            "type": "boolean"
        },
        "sic_codes": {
            "items": {
                "type": "string"
            },
            "type": "array"
        },
        "status": {
            "type": "string"
        },
        "type": {
            "type": "string"
        },
        "undeliverable_registered_office_address": {
            "type": "boolean"
        }
    },
    "type": "object"
}

If all goes well, when you add the final operation – Update Record – you should see all the returned data options:

Obviously the fields need to be available on the Dynamics 365 Accounts form.

Then….. Time to test!

Add a new Account in Dynamics and include the Company Number:

Once you save the record, the call is made to the Companies House API, and it returns any details it finds:

So this is very much a click-and-configure integration, but I think it's a nice use of Logic Apps and an easy way to integrate with REST APIs.

Changing Password Managers


Back in October last year, I posted a couple of articles around my recommendations and use of Two-Factor authentication. (Here’s part 1 and part 2 for those articles, if you’re interested in reading them.)

In my part 2 post, I indicated that I was also using a password manager and was using LastPass at the time. I have recently switched to 1Password and after posting a couple of things about that on Twitter, a few friends and followers expressed interest in knowing more about why I switched and how I would compare the two of them.

Brief background

I used Roboform for many years, but started finding it a bit clunky. Yes, I realize that is completely useless as viable feedback about a product, but that's the best way I can phrase it. I just found the UI "tired" and was looking for something new.

So, about a year ago, I made the switch to LastPass. I wish I could remember more of my reasons for changing but I can’t any longer, they’re gone from my memory! LastPass appeared to have some features that Roboform didn’t (at the time, at least) so that’s what I chose. It’s quite likely that what I liked about LastPass is in the Roboform product today for all I know, as the “good” password manager products out there do keep adding new features to their products. I do consider all 3 products I’ve used to be good products, I just now prefer 1Password.

Why did I switch?

When I switched to LastPass, I looked at 1Password too and, to be honest, I liked that LastPass had an "always free" version, whereas 1Password only has a 30-day trial (after which it is a paid product). Fast forward to the present day: I've learned more about infosec generally in the last year (out of interest) and decided to make a change.

My (somewhat lame) first reason

The more I read about data breaches and infosec news, the more I am subscribing to the theory of “if you’re not paying for the product, YOU are the product”. This isn’t the “real” reason I changed, as I could have simply upgraded to a premium version of LastPass. It was just one of the things that made me re-think my decision as cost was a factor a year ago when I don’t think it should have been.

The key reason: integration with Have I Been Pwned

LastPass has a “security center” feature which, among other things, checks if any of your logins were compromised (in a data breach). 1Password takes this to the next level, in my opinion. It integrates with Have I Been Pwned (HIBP for short, Troy Hunt’s service around data breaches) in its Watchtower feature, in two ways:

  1. Checking your logins against the HIBP service to see if they’ve been in a breach. LastPass does the same thing, although I’m unsure if it uses HIBP.
  2. Checking your passwords against HIBP's Pwned Passwords service to see if they have been found in a breach.

I was specifically interested in the second feature above. I already subscribe to HIBP's feature that will send you an email when one of your accounts is in a data breach, once they become aware of it. I typically use only one email address that is only for logins, not for friends, family or other correspondence. Oddly enough, in the 15-20 years I've used this as a login, it's only been in one known breach, and that was the most recent one called "Collection #1". That's simply good luck, nothing I can claim I'm doing to avoid that situation. My business email was in the LinkedIn data breach years ago, which is an example of a site where the random login email account wouldn't have been appropriate for business communications, and I've paid for that by it being in a breach and appearing on lists god-knows-where on the dark web.

I have taken the time to ensure that nearly every site I use has its own unique, randomly-generated password, so what I'm really interested in watching for now are instances where those passwords have been in a data breach. I'll describe more below about how this works, as it sounds like "they're sending your passwords over the internet?" No, they're not. Now that I'm confident most of my logins are adequately secure and complex, if any of those appear in a data breach, I will know exactly what site it might have come from, as my sites:passwords are a 1:1 relationship. If one of those passwords is in a breach, it doesn't necessarily mean MY account was in the breach (others may have the same randomly-generated password), but the odds are it's mine and I can go to that site and change my password.

Troy Hunt and HIBP

I have followed Troy Hunt on Twitter since I became a Microsoft MVP (when I first became aware of him). Troy is a fellow Microsoft MVP as well as a Microsoft Regional Director (more about the RD program here– think “Super MVP”!). He is the creator of the incredibly useful Have I Been Pwned service.

After listening to him speak as well as getting a chance to meet him last year at MVP Summit, I have taken an even keener interest in infosec. As I have listened to his weekly podcasts over the past year, I realized I still have a lot to learn about it.

His HIBP service is a free service, which originally started out with checking if your email account was in a data breach. More about the site can be found here, in the About page, as well as looking in the FAQ page. Here’s a clip of the stats from the landing page about what’s in his databases at the moment.

The key take-away to note here is that the pwned logins are not stored along with the pwned passwords. If your email is in a data breach, there is no service where he tells you which password was used in that breach. Lots of people argue with him over that, but the reasons are clear, and he's blogged about it here.

The Pwned Passwords feature is interesting to me on a level where I don't entirely understand the technical details but do understand the concept. In 1Password, the passwords are hashed (not stored in plain text for anyone to read or see), and the check against the HIBP passwords service sends the first 5 characters of the hash to HIBP's API, which returns all of the hashes that match that prefix. 1Password then compares my hash to the returned list to see if it's in it. No passwords are actually passed back and forth; only part of the hash is sent, so only the sender ever knows the full hash and, from it, the full password.

His example is this: let’s say you have “P@ssw0rd” in your list. The hashed value of that is “21BD12DC183F740EE76F27B78EB39C8AD972A757”. 1Password sends “21BD1” and HIBP returns (at the time he wrote this example) 475 results, for the items that start with “21BD1”. If “2DC183F740EE76F27B78EB39C8AD972A757” is in that list, then the password is flagged as compromised. More details around this example are on this post.
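For the curious, here is a minimal sketch of that k-anonymity check against the public Pwned Passwords range endpoint; the password is hashed locally and only the first five hex characters of the SHA-1 hash ever leave your machine (the example uses the same throwaway password from Troy's post):

import hashlib
import requests

def pwned_count(password: str) -> int:
    # Hash the password locally; only the five-character prefix is sent to the API
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    response = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}")
    response.raise_for_status()
    # The response is a list of "SUFFIX:COUNT" lines for every known hash starting with the prefix
    for line in response.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)   # number of times this password appears in known breaches
    return 0                    # suffix not in the returned list, so not known to be pwned

print(pwned_count("P@ssw0rd"))  # the example password from Troy's post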

Comparisons

Things I prefer in 1Password

It’s a bit too early to do a LOT of comparing but already I’m seeing things I like a lot better in 1Password than LastPass. Here’s the Watchtower “menu”:

To be fair, both products have features around re-used passwords, weak passwords and compromised logins. I described above why I was interested in the vulnerable passwords feature. I like the bottom 3 features as well, which I don't recall equivalents of in LastPass: insecure websites (HTTP not HTTPS), sites with 2FA options that I'm not using, and expiring passwords. I've already used the unsecured websites feature to update my saved URLs wherever the site supports HTTPS. Most did, but the old password manager simply kept whatever URL I saved at the time, and many were the HTTP version.

This (above) is what appears in the Unsecured Website area, or if I'm on a site where my login is in that list. It will actually not let me auto-fill that login or password (or so it seems), which I find a tad annoying. I'm fine being alerted to the issue, but if the site isn't secure and there is no HTTPS option, I would prefer it auto-filled my login and password anyway.

This is a snapshot of one of my vaults. Today the Unsecured Websites number is actually 2, which are both things I can’t change. I have several sites where I can enable 2FA, so that’s my next target. I don’t have any compromised logins, weak passwords or vulnerable passwords so that’s a good thing.

Another thing I noticed and like is the choice of random password generation type, beyond the typical length and whether to include numbers or symbols. 1Password gives you a "memorable password" option, which sounds dumb on the surface, but it helps create passphrases that are easier to remember while still being lengthy enough to be a security improvement.

Lastly, I like that you can have different “vaults” – groups of saved logins etc. I can segregate ones for work from personal ones and have actually created a vault for sites I no longer use, but want to keep the details in case of future data breaches. That keeps my active logins list a lot smaller and more manageable.

Similar features implemented in different ways

Various sites I subscribe to have different URLs but the same login and password. Amazon.com and .ca for instance. GPUG.com and all of the other UGs are another example. The way these work in LastPass is via an “Equivalent Domains” feature which is a place where you can say “Amazon.com and Amazon.ca are equivalent domains”. That allows you to have 1 login saved, and thus avoids the “you have 2 logins with the same password” problem.

In 1Password it appears that on a login, you can list different URLs which is a different way of implementing the same feature. I quite like this because you can also label it if you need to, other than each being called “website”. I didn’t, in the example above, but I could label them Canada, USA or whatever makes sense for the different sites using 1 login/password combination. In this case, they are linked so you can’t have different logins for the sites. If I have a choice not to use the same login & password combination, I would rather do that and separate the logins. With some sites, they are related but utilize the same authentication in the back-end so you don’t have that choice.

One LastPass feature I’ll miss

The one feature I'll miss that I can't find in 1Password is one I don't know the name of: the ability to lock certain logins so that I have to re-type my password before using them. I put that on my banking sites, anything work-related, and anything with financial info saved (sites with credit cards you can't remove) and things like that. If that's available in 1Password, I've yet to find it. It was a nice second layer that forced me to authenticate again before using that particular login.

Last thoughts

That's pretty much it for this post. I think what I may do for a #TipTuesday is write a brief primer on how to get started with a password manager if you don't already use one. There are lots of people who don't use one and track their passwords in far less secure ways, either in an unsecured note on their phones or computers or, even worse, by re-using passwords to reduce what they have to remember. A password manager, along with working through your logins to re-generate more complex passwords, is a much safer, better alternative!

Retail Enterprise Architecture mapping using ArchiMate and ARDOQ


A warning: this blog post is high level, but the benefits can be mind-blowing.

Enterprise Architecture is about understanding and change. In today's business, change is everywhere and is essential for survival. But change is not easy. Insight into and understanding of your own organization is essential for change and risk assessment. Understanding how people, processes and technology are connected gives focus to achieving high-value benefits. In my profession we use the Microsoft Dynamics technology stack as a main driver for implementing improvements. But we also acknowledge that Dynamics 365 is not the only system at work. Even though Dynamics 365 is a central component, there will always be many other systems, processes and technologies included in the enterprise architecture (EA). We need a way to describe all these connections in a uniform way, one that allows us to communicate a model of the enterprise dynamically.

But why should EA mapping be a central part of your business? Here are six business motivators and benefits of having a structured approach to EA mapping:

  • Increased stability and availability. It is vital that all central systems have near-100% availability. POS and back-end systems must always work, and the supporting processes must be streamlined to ensure that risks related to business improvements and changes are minimized and understood. The EA mapping documents the relationships and shows the consequences of changes.
  • Guaranteed performance. Acceptable system response 24/7, able to deal with business peaks, must be planned and built around the system. Systems must deal with a variable load, handling sudden events that change the transaction volume. Any disruption quickly results in customers walking away. The EA mapping must document the components central to performance compliance, and the business actors involved.
  • Scalable capacity. New stores or changes in the business model can quickly change the requirements for transaction and processing capacity. To be cost effective, capacity scalability must be dynamic according to the actual need, both in terms of scaling up and scaling down. The EA mapping documents the components central to scalability, and the business actors involved.
  • Strong security. Cyberattacks are increasing and it is vitally important to secure information and transactions. Being GDPR compliant puts demands on systems and internal processes for how to handle your own and your customers' information. Security, traceability and an audit trail build trust in the system and document compliance. The EA mapping documents governance and role compliance, and the business actors involved.
  • Right focus. There are always new business opportunities and process improvements. Keeping track of where to focus will lead to better and faster implementation of changes in a secure and stable manner. New ideas must be analyzed and risk assessed, and their implications understood. The EA mapping can assist in focusing on which changes have the highest priorities and benefits.
  • Cost control. Being a retailer involves large investments in technology like POS, mobile apps, customer portals and enterprise systems. Moreover, there may be large fluctuations in system usage throughout the year. By purchasing these capabilities as subscriptions, it is possible to even out the operating costs so that you only pay for what is needed. Good liquidity is achieved by balancing costly investments against the revenue stream and securing an actual return on those investments.

To move forward, a "language" is needed to describe an enterprise architecture model in which you can visualize, plan, implement and maintain all relationships that exist today, in transitions, and in the final vision.

Architecture Layers using ArchiMate

The overall mapping can be modelled in 5 main layers. Here I would like to focus on the symbolism used to identify them. The notation used is ArchiMate, an open and independent enterprise architecture modeling language that supports the description, analysis and visualization of architecture within and across business domains in an unambiguous way.

Motivation elements define the overall drivers and goals that the enterprise has. Much of the vision is located here. The Motivation elements can also be seen as a vertical layer, in close relationship to all other layers.

The Strategy layer defines the overall course of action and maps it to resources and business capabilities.


The Business layer defines the business processes and the services the enterprise provides; the main business processes are defined here. To simplify the modeling, it is relevant to start with Business Objects, Business Processes, Business Roles, Business Actors, Business Events, Business Services, and Business Rules and Logic.

The Application layer contains application services and capabilities, their interactions and application processes. Here Dynamics 365 and much of the Power Platform are located. To simplify the modeling, it is relevant to start with Data Objects, Application Functions and Application Components.


The Technology and physical layer describes the software and hardware (physical or virtual) capabilities that are required to support the deployment of business, data and application services; this includes IT infrastructure, middleware, networks, communications, processing, standards, etc. The underlying structure of Microsoft Azure would typically be described here. To simplify the modeling, it is relevant to start with Artifacts, System Software, Technology Services, Devices and Communication Networks.

Architecture Relationships using ArchiMate

The real beauty comes when the relationships between architecture elements are defined. But to do this, a set of predefined relationship types is needed. The most commonly used are the following:

Putting this together in a combined setup, I get the following relationship diagram of what is relevant to document.

(*Credits to Joon for this visualization)

As seen here, the business processes are a realization of the application functions, and this clarifies how proper Enterprise Architecture modelling is documented. With this model, we can see which business actors are assigned to which business roles. This in turn shows the business process assignment to the role. The business processes are there to realize business services.

Building the Architecture model using Ardoq

The architecture relationships can be challenging to describe using tools like Visio. Often we see that great work is done, but not used to its potential. An alternative is to use a cloud-based mapping tool such as Ardoq, which covers most aspects of documenting relationships between business processes, applications, roles, risks and transitions. This is not a commercial for the tool, but I find it great. So I decided to try using Ardoq to model the Contoso demo data.

Here I will focus on the Application Layer, as this is the layer where the application functionality and data are located. First, I create the application components:

Then I create the Application Functions, and I also import the Business Roles that are available in the Contoso demo dataset.

The next job is to build the relationships between the application functions (D365), business processes (vertical processes) and business roles. This will allow me to visualize and trace dependencies across all the EA mappings. Let's take an example, looking into the responsibilities of an employee named April Meyer.

Here I can see that she is related to the business roles Accounts payable clerk and Accounts payable manager. If I click on "Accounts payable clerk" I jump into the view of this business role, and I can see that it is related to the accounts payable business processes and has an association to April Meyer.

Jumping to accounts payable allows me to see the business processes involved.

I can also visualize the entire Enterprise Architecture Map with all objects and relations,

And zoom in on specific relations. This graph shows me that April Meyer belongs to the roles "Employee", Accounts payable manager and Accounts payable clerk. The Accounts payable clerk role is associated with the business process "Accounts payable", and the clerk role is associated with the Financial management modules in Dynamics 365.

Here is another visualization, which shows how the business objective of "Marketing" can be achieved, and which business roles, business processes, application functions and application components are involved.

Knowing the relationships and being able to communicate them is the key to happy Enterprise Architecture mapping.

Give it a try; the result can be very powerful.

Additional information

1. A high value blogger on Enterprise Architecture is http://theenterprisingarchitect.blogspot.com/.

2. Homepage of ArchiMate: http://pubs.opengroup.org/architecture/archimate3-doc/toc.html

3. Homepage of Ardoq: https://ardoq.com/. Give it a try!

Setting up the test environment for Microsoft Power Apps

Let your Operations Flow – Part 2


My previous blog was an introduction to getting started with Microsoft Flow and the Microsoft Dynamics 365 for Finance and Operations connector. In this episode, I will elaborate on the Create a record action: how you can use it, along with some related tips, e.g. how to use line numbers which should be incremented.

Create record action

The template which I used in my previous blog used the create record action also. A screenshot of the basic part shows the most important fields for the Vendors entity.

As you can see, you have to specify the environment as well as the entity to use. When these two values are specified, you will see a list of mandatory fields (marked with a red asterisk). When the entity is company-specific, you will also see a dataAreaId field. The Vendor account field in this example is mandatory in Microsoft Dynamics 365, but not in the entity. If you provide a value, it will be used as the vendor ID; if you leave it blank, logic will retrieve a number from the number sequence.

Use the create record action

When you want to create a Flow, you always have to start with a trigger. Triggers can be based on an event, such as a new mail message being received. The previous blog used a recurrence trigger. You can also install Flow on your mobile device and start Flows with a button.


When the trigger has been inserted, you can continue with actions. Usually, you would read content from a certain application and bring the data automatically to a destination. One destination is the Create record action on the Dynamics 365 for Operations connector. When you add a new action, you can search with keywords for the correct action; you can use e.g. Operations, Dynamics 365 or a combination to narrow down the results. At the time of writing this blog, there is a trigger for Dynamics 365 for Finance and Operations in preview; I will elaborate on it in a future blog post.


When you have inserted the Create record action, you can choose the correct environment. Usually you will first choose a test or demo environment before actually using a production instance. The second required field is the entity; Flow will list all public entities. Some names are quite easy to find, like Vendors. When you don't know the name of the entity or cannot guess the correct name, you can open the form in Dynamics 365 and use the Open in Excel function. When the Excel sheet has been downloaded, you can look in the design mode to see which entity is used.
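If you prefer not to guess or dig through Excel, you can also list the public entities directly from the environment's OData service. This is a rough sketch, assuming the standard /data service root of a Dynamics 365 for Finance and Operations environment; the URL and token are placeholders:

import requests

BASE_URL = "https://yourenvironment.operations.dynamics.com"   # placeholder environment URL
HEADERS = {"Authorization": "Bearer <access token>"}           # token obtained from Azure AD

# The OData service document lists every public entity set exposed by the environment
service_doc = requests.get(f"{BASE_URL}/data", headers=HEADERS).json()
names = [entity["name"] for entity in service_doc["value"]]

# Filter by keyword instead of guessing the exact name in the Flow action
print([name for name in names if "vendor" in name.lower()])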


In some cases, there are multiple entities for the same table, but with a version difference. For the vendors, there is a V2 version available. It is recommended to use the highest version number; older versions will be deprecated in the future.

In some cases, you get an error when typing an entity name. The connector tries to find the correct entity as you type; when you have typed only part of a keyword, it will mention that it could not find the entity. It is a bit annoying, but you can ignore this error while you are still in the process of selecting your entity.


When you have selected the correct entity, the mandatory fields will be listed, followed by a field for the company (dataAreaId) and the key field in case it is not mandatory. In this example, the vendor account number is not mandatory in the entity, but it is mandatory within Dynamics 365 itself. The reason it is not mandatory is that you can leave this field empty; in that case, it will try to pick up a number from the number sequence. If you did not provide an account number and the number sequence is set to manual, the flow execution will result in an error.
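Under the hood, the Create record action behaves like a plain OData POST against the entity. As a minimal sketch, assuming the VendorsV2 entity discussed here (the URL, token and field names are my assumptions for illustration), omitting the account number lets the number sequence assign one:

import requests

BASE_URL = "https://yourenvironment.operations.dynamics.com"   # placeholder environment URL
HEADERS = {
    "Authorization": "Bearer <access token>",                  # token obtained from Azure AD
    "Content-Type": "application/json",
}

new_vendor = {
    "dataAreaId": "usmf",                        # company, because the entity is company-specific
    # "VendorAccountNumber": "V-10001",          # omit to let the number sequence assign a number
    "VendorOrganizationName": "Contoso Supplies",
    "VendorGroupId": "10",
}

response = requests.post(f"{BASE_URL}/data/VendorsV2", json=new_vendor, headers=HEADERS)
response.raise_for_status()
print(response.json().get("VendorAccountNumber"))  # the number-sequence value, if one was assigned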


To be able to use all other fields, you can click the Show advanced options option. It lists all other attributes in a non-alphabetical order; it would be nice if the field list had a certain structure, but even in its current form it is possible to map all fields.


When you have completed the flow, you can test and activate your task.

How to use line numbers in Flow?

A common question is how to use line numbers in Flow. They are not automatically filled when you are using Flow or other OData-based tools to insert records. Flow can support incrementing integers by using Variable actions.


Directly after the trigger, you can add Initialize variable actions. You need to declare all variables that will be used in your Flow. For line numbers, you can use the Integer type. The first value can be set to 1 (or another value according to your preference).


For a flow that is, for example, importing purchase order lines from an Excel file, you can use these line numbers. You could choose to increment the line numbering in steps of 1, 10 or something else. How to use the line numbering and increment the value for the next line can be seen in the next screenshot.


The purchase lines Create record action has the variable LineNumber mapped as input for the LineNumber field. After the create action, the LineNumber is incremented by 1 in an Increment variable action. In this way, you can use line numbers provided by Flow if the business logic in Dynamics 365 for Finance and Operations does not automatically populate the Line number field.
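The same initialize/create/increment pattern, expressed outside Flow as a small sketch (the PurchaseOrderLines entity and field names are my assumptions for illustration, and the order number would come from the header create described in the next paragraph):

import requests

BASE_URL = "https://yourenvironment.operations.dynamics.com"   # placeholder environment URL
HEADERS = {
    "Authorization": "Bearer <access token>",
    "Content-Type": "application/json",
}

purchase_order_number = "000123"                 # value returned when the header was created
excel_rows = [                                   # rows as they might be read from the Excel file
    {"item": "A0001", "qty": 5},
    {"item": "A0002", "qty": 2},
]

line_number = 1                                  # Initialize variable (Integer), starting value 1
for row in excel_rows:
    line = {
        "dataAreaId": "usmf",
        "PurchaseOrderNumber": purchase_order_number,
        "LineNumber": line_number,
        "ItemNumber": row["item"],
        "OrderedPurchaseQuantity": row["qty"],
    }
    resp = requests.post(f"{BASE_URL}/data/PurchaseOrderLines", json=line, headers=HEADERS)
    resp.raise_for_status()
    line_number += 1                             # Increment variable action equivalent (step of 1)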

Use values from other Dynamics 365 actions

When you look at the screenshot above, you will notice that the purchase order number and data area fields are filled with Dynamics 365 fields. In this Flow, I used a purchase order header entity to insert a new header record first. When the purchase order has been inserted, you can re-use the values in the Flow for other entity actions, or to provide a notification of which order was inserted.

 

The next part (3) will inform you about the Dynamics 365 for Finance and Operations update action in Flow.

 

That’s all for now. Till next time!

 

 


 

The post Let your Operations Flow – Part 2 appeared first on Kaya Consulting.

#GPPT GP Power Tools Build 26 is coming soon

As mentioned in my previous post, I have been working hard for the last few months on the development of the next build of GP Power Tools. Build 26 of GP Power Tools is already at release candidate stage...(read more)

Get Fields From Related Records Using Actions

Have you ever had that requirement where you know you can solve most of it with a workflow but the information you are after is one relationship too far away? For example: I have a custom entity called...(read more)

Register Now for our Live Webinar – Manage Dynamics 365 CRM documents on SharePoint and Dropbox – The Windows Explorer Way!

As you must have noticed, managing documents in Dynamics 365 CRM on cloud storage is yet to evolve. SharePoint is the only connector with native integration in Dynamics 365 CRM; however, there are not...(read more)

sync-navtenant : The operation could not complete because a record was locked by another user. Please retry the activity.


Unfortunately, I had to spend my last four days, longer than planned, on the upgrade to CU3. After working through everything step by step, the schema synchronization would not work properly and ended every time with the following error:

sync-navtenant : The operation could not complete because a record was locked by another user. Please retry the activity.

There are already a few suggestions for this on the internet, but unfortunately none of them worked for me. Here is what I tried:

  • Compiled dozens of times (in C/SIDE, tables also get locked and the locks are released afterwards)
  • Ran the SQL statements for retention and change tracking.
  • Replaced the system tables again.
  • Ran Test-SchemaSync in PowerShell and fixed all reported problems.

Unfortunately, none of this helped, and after three days the good old classic idea came to me.

My Solution

  • Created a .navdata file from the old database.
  • Installed CU3.
  • Created a new, empty database on CU3.
  • Created a new, empty company via PowerShell (you can delete it later).
  • Imported the .navdata file.
  • Started sync-navtenant.

So, not particularly elegant, but it did the job.

Note

While doing this, I noticed that the NAVAdministration.psm1 module on the DVD still points to the old installation directory (../Program Files (x86)/Dynamics NAV/.), but the new installation directory is Dynamics 365 Business Central. If you also use this module, you will need to adjust the path there and reload the module in PowerShell.

Best regards, Rene


The post sync-navtenant : The operation could not complete because a record was locked by another user. Please retry the activity. appeared first on Dynamicsblog.

Dynamics 365 Cloud Insight Workshop – become a #Cloud Shooter in 5 days


Dynamics 365 Cloud Insight Workshop – become a #Cloud Shooter in 5 days. Our Dynamics 365 Insights Bootcamp is planned for mid-March: Business Central, AL, Sales, Integration and Office 365, all in one go. For our Learn4NAV / Learn4D365 subscribers we have also put together a great price/performance package. You can find more about it on our website.

Dynamics 365 Cloud Insight Workshop – become a #Cloud Shooter in 5 days

Day 1: Dynamics 365 Business Central
Day 2: AL and Visual Studio Code
Day 3: Dynamics 365 for Sales
Day 4: Integration of Dynamics 365 Business Central and Dynamics 365 for Sales
Day 5: Office 365 Integration

Content
In this 5-day workshop you will receive an introduction to using the Business Central application in the cloud. To get the most out of the application, you will then learn about the customization options. Dynamics 365 Business Central provides the ERP functionality, but also offers an integration with Dynamics 365 for Sales. On day 3 you will therefore learn the basics of the cloud application Dynamics 365 for Sales, and afterwards integrate the two applications with each other.

The post Dynamics 365 Cloud Insight Workshop – become a #Cloud Shooter in 5 days appeared first on Dynamicsblog.

Tip #1217: Use accounts with web roles in portals


In case you were wondering, we are still around even though it's been an awfully long time since our last tip. The festive season was upon us, but the real reason was that we were just too darn busy. That's good, right?

Now, we are back and, of course, with a bunch of tips. Now, brace yourself. I’ve been working with the portals for the past few weeks. What it means is that tips about portals are going to dominate our feed for the next N days.

Ready? Let’s go.

O Accounts, where art thou?

We’ve been repeatedly told that an authenticated portal user is always associated with a contact, regardless of the authentication method used. Once upon a time we could use accounts but not any more. So, good bye, accounts? Not so fast.

It turns out that when a user signs in, they inherit any web role assigned to their parent account. Boom. That little-known feature allows very effective management of the base permissions on a portal for your customers or partners: instead of assigning and managing web roles for thousands of users, you can assign and manage roles for hundreds of accounts instead.

If contacts have a parent account that happens to be a partner, assign the Partner web role to the account and, presto, all contacts under the account will have the Partner role after signing in. Partner level dropped from Platinum to Silver? Just replace the web role in one place and all 27 contacts under that partner are good to go with the demoted permissions.

(Cover photo by Mikkel Bergmann on Unsplash)

Implementing SmartConnect: SmartConnect vs. Integration Manager


This post is part of the series on Implementing SmartConnect, an integration tool from eOne Solutions which can take data from any source and integrate it into Microsoft Dynamics GP (and other systems such as Microsoft Dynamics CRM or Salesforce, amongst others). It has a drag-and-drop interface to make creating integrations quick and easy for all users, rather than just the developers whom many integration tools target.

For providing clients with integrations, we've typically used Integration Manager, which is part of the Customisation Pack. Having recently taken a look at SmartConnect from eOne Solutions, I rapidly came to the conclusion that while Integration Manager is easy to use and integrations are relatively simple to create, it lacks a lot of the features of SmartConnect.

Feature | Integration Manager | SmartConnect
Easy to create integrations
Schedule integrations to run automatically
Use Excel XLSX as a data source
Connect to almost any data source
Integrate to any eConnect node
Integrate with custom eConnect nodes
Extend integrations programmatically | VBA | VB.NET or C#
Chain integrations to run one after the other
Schedule export of data to other systems
Robust when integrating large datasets


Read original post Implementing SmartConnect: SmartConnect vs. Integration Manager at azurecurve|Ramblings of a Dynamics GP Consultant; post written by Ian Grieve (Lead ERP Consultant at ISC Software Solutions)



Users Not Disabling in Dynamics 365 v9 Online


To disable a user in Dynamics 365 online, we can remove their Dynamics 365 licences in the Office Admin Centre. However, we recently had an issue where we removed all the licences that included Dynamics 365 from a user's Product licenses list, and yet the user was still enabled in the Dynamics 365 instances.

Upon doing some research, we found that as part of the expansion of the Common Data Service, it now contains the Dynamics 365 Customer Engagement platform too. Therefore, we had to remove the Common Data Service licence from the users as well.

Removing both the Dynamics 365 licences and the Common Data Service licence from the user solved the issue; the users were no longer enabled in Dynamics 365.


Importance of Territories in Microsoft Dynamics 365 Field Service

Hi everyone, today I am going to share some important points about Territories in Microsoft Dynamics 365 Field Service when implementing a project. Territories: How to create Territories in Field...(read more)

MB6-898 Enable and use Worker actions and Position actions

To use personnel actions, you first need to activate worker and/or position actions in Human resources shared parameters -> Personnel actions. Human resources shared parameters personnel actions...(read more)

Create Leads from a website using Logic Apps

Case Study: you have a company site hosted on a CMS (let’s assume WordPress for the purpose of this). You want to capture inquiries from the public, and create Leads in your Dynamics 365 for Sales. ...(read more)

Microsoft Dynamics GP, What work was posted today?


Originally Posted 3/29/12

Image from Microsoft Dynamics GP 2018

Use Microsoft Dynamics GP SmartList and add the "Posted Date" or "Originating Posted Date" column to your object. This is the actual system date (the date in the bottom right-hand corner of your PC), i.e. the real date the posting happened. This is a great way to see what was performed for the time period in question.

Another great reason to look at this field is for problem-solving. Imagine that your checkbook (bank reconciliation) was in balance with your GL yesterday, but today it's not. Use SmartList to see what was ACTUALLY posted today, regardless of the GL posting or transaction date. That's where your problem will be.
