
How Microsoft Dynamics 365 for Field Service impacts the bottom line


A new study by Forrester Consulting calculates that implementing Microsoft Dynamics 365 for Field Service can help organizations realize benefits of $1.8M over three years, adding up to a net present value (NPV) of $1.4M and an ROI of 363 percent.
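
For context on how these figures relate, Forrester's TEI framework computes ROI from the present value (PV) of benefits and costs: ROI = (PV of benefits - PV of costs) / PV of costs, and NPV = PV of benefits - PV of costs. Working backward from the rounded figures above (an inference on our part; the study reports the exact cost figure separately), the implied investment is roughly $0.4M PV, so ROI ≈ ($1.8M - $0.4M) / $0.4M ≈ 350 percent and NPV ≈ $1.4M, consistent with the reported 363 percent once rounding is accounted for.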

The Microsoft-commissioned Total Economic Impact (TEI) study is based on interviews with three long-term telecom, construction, and manufacturing customers using Microsoft Dynamics 365 for Field Service. Prior to deploying Dynamics 365 for Field Service, these organizations were evolving from a bare-bones approach to field service operations. As they grew, it became evident that they needed a more proactive approach that could stretch existing resources further.

Key findings

  • Through increased transparency into field operations and greater efficiency, field teams reduced hours billed for repair and maintenance work orders by up to 60 percent. Over three years, the associated cost savings reached a present value (PV) of $1.4M.
  • By enabling IoT and remote device access, organizations eliminated field dispatch for over 10 percent of total work orders, with the resulting labor cost savings totaling a three-year PV of $360K.
  • Efficiencies from optimizing or eliminating travel were not limited to fewer labor hours billed. Field workers slashed time driving to work sites by 50 percent. The fuel savings over three years amounted to a PV of $28K.
  • By leveraging automation, call center agents reduced service calls by a minimum of 20 percent with a three-year PV of $13K.

But that's not all. There were many unquantified benefits, such as heightened customer experience and satisfaction. These companies also reduced overtime and avoided regulatory penalties while improving hardware quality.

The bottom line

Forrester took a deep dive into the customer journey of these organizations, sharing their challenges, their vendor selection process, and the results of their investments. All three customers agreed that Dynamics 365 for Field Service provided:

  • A modern and customizable field service management solution right out of the box.
  • Increased field utilization and improved visibility across the entire field service landscape.
  • Greater operational efficiency including reducing the number of repeat visits, service reminder and survey calls, and technician transportation costs.

Learn how Microsoft Dynamics 365 for Field Service can modernize, optimize, and energize your field service organization.



Retail webinar for June


The monthly Retail webinar held on June 6 was filled with some great content and attended by over 80 participants from around the globe.

We started with a demo of extensibility enhancements during which the new application programming interface (API) triggers were used with the point of sale (POS). After that, we discussed details of the Regression Suite Automation Tool (RSAT) for POS, which enables retailers to perform regression testing on POS to ensure new enhancements don’t break existing business processes. We then covered the statement posting enhancements, which were added to improve posting performance and provide new validation rules for transactions. This was followed by the product search and discoverability enhancements, where we showcased faster product searches and new refinement capabilities. The last part of the webinar provided insights into the upcoming task management capabilities, which will ensure efficient store execution. We also showcased our vision for simplifying the management of the button grids displayed on POS.

We've posted a recording and the presentation materials from this webinar, as well as information about the upcoming webinar, on the Retail Interest Group Yammer group. This Yammer group is quite active; implementation partners, support, customers, and product group members use it to post questions and comments, allowing members to take advantage of the collective knowledge of the Retail community. To join this group, go to the Yammer site to send a request: http://aka.ms/retailinterestgroup.

The next webinar will be during the second week of July. We're looking forward to seeing you all there!


Essential tools to create a better workplace – Microsoft at SHRM 2019


This week, Microsoft will be at the SHRM 2019 Annual Conference and Exposition to share best practices on how to unlock talent success, connect to collaborate, and drive culture change. The event theme, creating better workplaces, aligns with our focus on creating a modern workplace where people thrive.

In recent years, we've seen exponential growth in digital connectivity, devices, technology, and data. And yet, our workplaces and workforces still operate according to the early 20th-century assembly line models of repetition, efficiency, and scale.

Modern workplaces are more than just exceptional places to work; they are cultures where people are inspired and empowered to innovate, create, and collaborate. Achieving this vision requires human resources to build a modern workforce, bringing people together to pursue their passions and realize their purpose. A modern workforce is made up of connected people.

Microsoft is helping HR leaders to digitally transform the employee experience to create a workplace where people can thrive. As SHRM 2019 attendees will learn, organizations can break down barriers that prevent talent success with the tools needed to connect and engage with employees, not only in the office but all over the globe.

Microsoft is helping human resources professionals to accelerate success with three solutions showcased at SHRM 2019:

Connect to collaborate with Microsoft 365

Boost productivity and empower employees by giving them the modern tools to create and innovate, enabling them to deliver amazing results and do their best work together. Microsoft Teams is the hub for teamwork in Office 365: a shared workspace where you can chat, meet, share files, and work with business applications.

Explore Microsoft 365

Drive cultural transformation with Workplace Analytics

Cultivate success with an engaged and efficient workforce using insights from everyday work in Office 365. Empower teams to collaborate effectively, free up time for their most important work, and create a healthy work-life balance. Identify and promote the right skills and behaviors to meet organizational goals, develop careers, and grow new leaders.

Get intelligent insights with Workplace Analytics

Unlock talent success with Dynamics 365 for Talent

Hire, develop, and retain people who can deliver impactful results. Microsoft Dynamics 365 for Talent empowers you with tools to land top candidates and accelerate their success. Meanwhile, your employer brand will stand out thanks to a fast, seamless experience that's optimized for mobile and integrated with LinkedIn, the world's largest talent marketplace. With Dynamics 365 for Talent, you will develop a nuanced understanding of your employees and capture data to improve their experience.

Learn more about Dynamics 365 for Talent

Attending SHRM 2019? Connect with Microsoft!

Visit Microsoft at SHRM 2019 Annual Conference and Exposition.

If you're attending SHRM, we invite you to stop by our booth to speak with experts from the Microsoft 365, Workplace Analytics, and Dynamics 365 teams, get a free professional headshot, and learn how we're digitally transforming the employee experience.

We also invite you to attend two sessions hosted by our team:

Test your Microsoft for HR Tech IQ
Sunday, June 23 | 5:45-6:15 PM | Exhibitor Solutions Theater
Speaker: Michele Ballinger, Sr. Product Marketing Manager, Dynamics 365

Enjoy a fun game and test your knowledge of how Microsoft is helping to transform the HR function. Show your modern HR smarts and learn along the way. There will be lots of fun and prizes at this session, so don't miss it!

Cultural Transformation at Microsoft
Monday, June 24 | 4:15-5:15 PM | LVCC N109-114
Speaker: Joe Whittinghill, CVP, Talent, Learning and Insights

In this session, we will share what Microsoft has learned throughout its ongoing cultural transformation. Over the last three years, Microsoft has been on a journey to rediscover its soul, transforming its culture from one of "know-it-alls" to one of "learn-it-alls" grounded in a growth mindset. Hear from Microsoft HR leader and learner Joe Whittinghill about this ongoing process.

We look forward to meeting you!


June release of Dynamics 365 Remote Assist adds small UI and other improvements


Applies to: Dynamics 365 Remote Assist (version 2.02)

In the June release, we updated our user interface (UI) framework so that we can make updates and release new features faster. You may notice some small UI differences, but all the features remain the same. Among the updates, the contacts list has a new look and feel, and OneDrive is built right into the call window.

OneDrive button

We also added support for contact names containing Japanese characters.

Japanese characters in the Contacts list

For information on using Remote Assist, see the Remote Assist user guide.


Lifecycle Services – June 2019 (Release 2) release notes


The Microsoft Dynamics Lifecycle Services (LCS) team is happy to announce the immediate availability of the release notes for LCS (June 2019, release 2).

Prevent overlapping service requests against the Production environment

Under certain conditions, allowing service requests to overlap on the Production environment can prevent rollback support. Specifically, any time a deployable package is scheduled for the Production environment, a backup of the environment is taken. This backup would not be consistent if a golden database refresh or a point-in-time restore of the Production database was performed in the meantime.

To help ensure that rollback support is always available for the Production environment, overlap of service requests on the Production environment will no longer be supported.

This means that a deployable package deployment, a Sandbox to Production database refresh, or a Production point-in-time restore service request cannot be scheduled while another service request is pending.

Overlap is not a common occurrence, but this change is important to note for customers planning execution times for a mock go-live or a full go-live. We recommend that you apply the deployable package first, and once it completes, request the Sandbox to Production database refresh in preparation for the launch.

 

Self-service upgrade support for 10.0.3 with Platform update 27

The available upgrade versions for customers running application versions 7.0 through 7.3 have been updated to include application release 10.0.3 with Platform update 27. For more details about which version to select, see Self-service upgrade to the latest version: https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/self-service-upgrade#understand-which-version-to-select-for-upgrade.

 

Update to the pause policy for One Version

We recently announced a flexible service update policy for One Version. Customers now have the option to pause up to three consecutive updates if they are unable to take them due to industry regulations or other valid business reasons. With this release of LCS, customers can participate in this policy and pause up to three consecutive updates via the Update settings page in LCS. To learn more about how to pause updates, see Pause service updates.

 

Cancel a customer scheduled update on a production environment deployed using self-service deployment

With this release of LCS, customers can now cancel a scheduled deployment of an update to their production environment. This applies only to updates scheduled by the customer on environments deployed using self-service deployment.


Upcoming changes to the update tiles in Lifecycle Services (LCS)


We want to notify you about the changes we're making to the Lifecycle Services (LCS) update tiles for One Version environments. This change is planned for release in the July 2019 update of LCS.

There will be a new update experience when taking updates from the environment details page for environments running on version 8.1 and later.

Currently, when taking updates from the LCS environment details page, there is a single update tile for getting the latest generally available cumulative update.

The following changes will be included in the new LCS update experience:

  • Choose the version of updates available to your environment. In the new update experience, instead of one update tile for the latest cumulative update, you'll have the option to take a quality update or a new generally available release.
  • A quality update includes the fixes available for your current version. Quality updates are available to environments that are running the same version as the current service update, or one version older.
  • A new release includes the current service update version that is automatically applied to customer environments based on the LCS project update settings. The service update is available if your environment has not yet been updated to the current service update version.
  • A new release may also include the upcoming service update version once it is made generally available. The upcoming service update is available for self-update approximately two weeks before the auto-update schedule starts.

This change does not impact the existing One Version service update policy, and the LCS update for One Version continues to provide cumulative updates.

If your environment is on an earlier version, such as 8.0 or 7.x, there will be no change to your LCS update tile experience.


Generate PDF quote documents based on standardized templates


When a customer is ready for a formal proposal that contains the most current pricing information and product quantities, they are generally presented with a quote. Until now, a salesperson using Dynamics 365 for Sales could generate quote documents only in Word format using Microsoft Office templates, and then had to manually convert the Word documents to PDF before sharing them with customers. With PDF generation from quotes now generally available, salespeople can quickly create PDF quote documents based on standard templates to share with customers.

With PDF generation from quotes, salespeople can:

  • Create and download PDF documents from quote records based on standardized templates.
  • Quickly send emails to customers with PDF quotes attached.

PDF generation from quotes simplifies the effort and reduces the time it takes to complete one of the most frequently performed actions in a sales process.

Before we get into how to generate a PDF document from quote records, let’s review some prerequisites for accessing the feature. You need to:

  • Be using Dynamics 365 for Sales (app) version 9.0.1905.2010 or later or Dynamics 365 for Sales Professional (app) version 9.1.1904.1025 or later.
  • Enable PDF generation from quotes.
  • Have uploaded Word templates for quotes that can be used to generate the quote documents.

Check your version of Dynamics 365 for Sales

To check what version of Dynamics 365 for Sales you are using, select the Settings icon on the navigation bar, go to Advanced Settings > Settings > Customization > Solutions, and check the version of the Dynamics 365 for Sales application or Dynamics 365 for Sales Professional solution, as applicable. You need to be using Dynamics 365 for Sales (app) version 9.0.1905.2010 or later, or Dynamics 365 for Sales Professional (app) version 9.1.1904.1025 or later.

List of solutions with Sales app highlighted

Enable PDF generation from quotes

If you are using Dynamics 365 for Sales, you can enable the feature by going to Sales Hub > App Settings > Sales Administration > PDF generation.

PDF generation settings page to enable PDF generation

 

If you are using Dynamics 365 for Sales Professional, you can enable the feature by going to Sales Professional > Setup > Sales settings > Advanced settings > PDF generation.

PDF generation settings page to enable PDF generation

Upload standardized templates

To be able to create PDF documents from quote records, you must have uploaded templates for the quote entity, which provide the format of the documents to be created. To learn more about how to create standardized Word templates and how to upload them, see Use Word templates to create standardized documents.

You can also use the two templates that come with sample data available for the application: Print quote for customer and Quote summary.

Create quote PDF

When the feature is turned on, quote records will start showing two buttons in the main form: Create PDF and Email as PDF.

To download a quote as a PDF document in conformance with an uploaded standardized template, open the quote record. Then select Create PDF, and select the template you want to use.

 

The application starts downloading the quote as a PDF document. You can save and use the quote document.

Sample of quote PDF

 

To share a quote with a customer, select Email as PDF in the quote form, and select the template. The application creates a draft email with the quote PDF document attached. You can complete the email and send it to your customer with just a few clicks.

Email as PDF button in toolbar of quote form

 

Draft email with quote

 

You can find documentation of the feature here: Generate a PDF document from a quote record. Check out the feature today, and share feedback on what more you would like to see on our ideas portal.

 


June release of Dynamics 365 Import Tool (Preview) adds five new file formats


Applies to: Dynamics 365 Import Tool (build 111.1906.20006.0)

We’re pleased to announce that with the latest update of Dynamics 365 Import Tool (Preview), in addition to GLB files, you can now import FBX, OBJ, STL, PLY, and GLTF files.

Import Tool dialog box

This update makes it easier to import your CAD files for use with Dynamics 365 Layout, Dynamics 365 Guides, or Dynamics 365 Product Visualize.

If you use the Send to Microsoft option, you can now also specify what’s most important to you. For example, you can specify whether visual fidelity or performance is most important.


Learn more about using Dynamics 365 Import Tool (Preview).



Focusing on the customer experience at Customer Contact Week


As I've been preparing to take off for Customer Contact Week (CCW) in Las Vegas, NV, this week, I've also been ruminating on what it means to lead a truly customer-centric service organization.

As I see it, customer service managers need to remember that customer-centricity is about designing an experience tailored to the customer as an individual, not as a case number. In the nonstop flow of a call center, or even in back offices or face-to-face customer service situations, it can be far too easy to lose sight of the actions that truly help an individual in need versus the boxes checked to close a case on time. A process may at first feel completely customer-centric, but if it doesn't directly lend itself to the customer's expectations, it may be worth a second look.

You can plan to modernize your customer service organization with new technology, new processes, and new ideas to help create a customer-centric experience, but where do you start when budgets continue to wane as customer expectations soar? These are the questions I'll be asking myself as I meet with customers and industry leaders over the next few days.

Microsoft at Customer Contact Week

Microsoft is an exhibiting sponsor this week at CCW, one of the largest customer service-focused industry events. Why CCW? The five-day conference in Las Vegas, NV, is where customer service and contact center leaders come together to share best practices in agent training, emerging call center technology, performance metrics, quality assurance, cost reduction, and other critical customer service priorities.

Sharing these best practices is a great start. But the next step is taking the learnings and designing a holistic customer experience that deepens loyalty and grows customer engagement. After all, if the solution is not customer-centric, it's not a solution.

Microsoft will be demonstrating some of our latest customer-centric offerings around Dynamics 365 for Customer Service at CCW that focus specifically on the customer experience. We recently launched two AI-driven Dynamics 365 apps that can enhance the customer experience, Dynamics 365 Virtual Agent for Customer Service (currently in preview) and Dynamics 365 Customer Service Insights. These apps extend the rich capabilities of Dynamics 365 for Customer Service by leveraging artificial intelligence. By unifying technologies, these apps empower agents and personalize the customer experience.

Customer service AI takes center stage at CCW

In case you're not able to make it to CCW this year, I wanted to share an overview of some of our new customer service features and share some opportunities for you to check them out on your own time.

Let's start with AI-enabled virtual agents. One of the most significant issues with most virtual agent offerings is the inability to make the subject matter expert (SME) central to the development of the bot. Often, coding expertise is required to create a bot, which significantly lengthens the development cycle. With each string of code entered by the developer or data scientist, the SME becomes further removed from the content, which impacts the quality of the bot conversation and its ability to perform. And poor bot performance negatively affects the customer experience.

Dynamics 365 Virtual Agent for Customer Service is all about the solution, not the technology. Virtual Agent brings AI to those who know customer service best. Subject matter experts can build, launch, and maintain virtual agents without the support of developers or data scientists. Customer service SMEs can create quality conversations and refresh bots ad hoc, avoiding long update cycles and without soliciting the help of intermediaries. By leveraging natural language capabilities and AI models that adapt, bots can understand customer intent, personalize the experience, handle routine tasks, and take action on behalf of the customer, leaving more complex issues to the live agent. By removing the complexity, Virtual Agent for Customer Service can help you focus on delivering a consistently positive customer experience.

Dynamics 365 Customer Service Insights helps enhance the customer experience through the power of AI. It digests volumes of data to visually display performance and operational metrics that are consumable and actionable. Built-in dashboards with interactive charts and visual filters identify opportunities for improvement that have the greatest impact. This helps customer service managers evaluate and respond to key performance indicators (KPIs) while enhancing customer satisfaction.

The customer experience is enriched through the power of uniting these AI-enabled apps, creating a complete customer-centric solution that is easily adapted to any organization.

There's no need for FOMO if you can't experience our demos firsthand with us at CCW this year! You can contact us any time to request a live demo or set up a free trial. We can also help you prioritize which opportunities will provide the greatest impact for your customer service organization so you can consistently deliver exceptional customer experiences.

For more information, visit Microsoft Dynamics 365 for Customer Service. While you're there, check out Dynamics 365 Customer Service Insights, and be sure to sign up for the public preview of Dynamics 365 Virtual Agent for Customer Service to learn how easy it is to create a bot to enhance customer self-service solutions.


Automatically validate your solutions using the PowerApps checker PowerShell Module

A number of customers have automated build and release pipelines, and a frequent ask since announcing the general availability of Solution Checker is for the ability to run checks outside of the user experience and in an automated manner. This is now possible with the new PowerApps checker PowerShell module that we have released for preview in the PowerShell Gallery today!

 

Using the module, you can check your solutions directly from your pipelines and receive a detailed report of issues similar to the one available from the PowerApps maker portal. Alternatively, you can allow your teams to validate their solutions locally by using the interactive login mode to access the checker service.

 

With the module, you also have additional capabilities and flexibility in the types of solutions you can check, the resources in your solution to check, and which rules your solutions are checked against:
  • Validate both managed and unmanaged solutions (CRM 2011 to current)
  • Validate more than one solution at a time
  • Validate solutions contained in a package
  • Validate on-premises solutions
  • Rules available for ISVs for AppSource certification
  • Exclude files from the analysis
  • Control which rules are included in the analysis

 

How do I get started?

There are a few prerequisite steps you will need to complete prior to using the PowerShell module to validate your solutions. You can follow the instructions in the PowerApps checker documentation to complete the steps below.
  • Install the module
  • Set up authentication and authorization in your Azure tenant

 

How do I check my solution or package?

A set of cmdlets is available to interact with the service and run your checks. These cmdlets let you retrieve a listing of rules and rulesets, and run an analysis job.

 

Detailed help and examples on how to use each cmdlet are available to guide you through each step of the check process.
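
As a rough sketch of what this can look like in a pipeline (not the official documentation sample), a script might resemble the following. The cmdlet and parameter names shown (Get-PowerAppsCheckerRulesets, Invoke-PowerAppsChecker, -Geography, -Ruleset, -FileUnderAnalysis, -OutputDirectory) and the module name should be verified against the module's built-in help, and the Azure AD application values are placeholders you must supply:

    # Sketch only: run a checker analysis from a build pipeline.
    Install-Module -Name Microsoft.PowerApps.Checker.PowerShell -Scope CurrentUser

    # Retrieve the available rulesets and pick the one to analyze against.
    $rulesets = Get-PowerAppsCheckerRulesets -Geography UnitedStates
    $ruleset  = $rulesets | Where-Object { $_.Name -eq 'Solution Checker' }

    # Run the analysis on a solution file, authenticating with an Azure AD application.
    $result = Invoke-PowerAppsChecker `
        -Geography UnitedStates `
        -ClientApplicationId '<application id>' `
        -TenantId '<tenant id>' `
        -ClientApplicationSecret (ConvertTo-SecureString '<client secret>' -AsPlainText -Force) `
        -Ruleset $ruleset `
        -FileUnderAnalysis 'C:\build\MySolution.zip' `
        -OutputDirectory 'C:\build\checker-reports'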

 

How do I access the results?

When running a check, you specify a folder where the reports are saved. Once complete, you will have access to a result object that contains the locations of the reports and a summary of findings by severity. You can use this summary to make decisions in your automated processes without having to parse the report files. The reports are downloaded as a zip file containing reports in JSON format. The JSON is formatted according to a standard referred to as the Static Analysis Results Interchange Format (SARIF). There are tools available to view and interact with SARIF documents; for more information, refer to the SARIF project website.
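
For example, a hypothetical build script could gate on the severity summary of the returned object and then crack open the SARIF reports. The property names used below (IssueSummary, CriticalIssueCount, HighIssueCount) are illustrative assumptions to verify against the module's help; only the runs/results structure comes from the SARIF standard itself:

    # Sketch only: fail the build on high-severity findings (property names are assumptions).
    $summary = $result.IssueSummary
    if ($summary.CriticalIssueCount -gt 0 -or $summary.HighIssueCount -gt 0) {
        throw "PowerApps checker reported critical or high severity issues."
    }

    # The reports are downloaded as a zip of SARIF (JSON) files; expand them and list the findings.
    Get-ChildItem 'C:\build\checker-reports' -Filter *.zip | ForEach-Object {
        Expand-Archive -Path $_.FullName -DestinationPath 'C:\build\checker-reports\extracted' -Force
    }
    Get-ChildItem 'C:\build\checker-reports\extracted' -Filter *.sarif | ForEach-Object {
        $report = Get-Content $_.FullName -Raw | ConvertFrom-Json
        # SARIF stores findings under runs[].results[]; print each rule id and message.
        $report.runs | ForEach-Object { $_.results } | ForEach-Object {
            '{0}: {1}' -f $_.ruleId, $_.message.text
        }
    }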

 

Upcoming capabilities

For customers using Azure DevOps, we will soon be previewing a DevOps task to make integrating with the service even easier. This will provide all the capabilities of the PowerShell module and also allow for viewing results directly in your build. Stay tuned for this announcement!

 

Another small improvement

For issues found with assemblies contained in your solutions, the location field in the report generated by the service contained an undefined reference. While it was possible to determine the assembly reference using the module/type/member fields, we have now improved this and included the assembly name in the location field to help you identify the issue location faster. This change is currently rolling out to all regions.

 

We are really excited to see what our community will do with this capability to continuously review the quality of their solutions. If you have questions or other feedback, you can provide it via this post or on the PowerApps Community site.


Improved packing functionality (Dynamics 365 for Operations 1611)



 

Based on the input that we have received from many of our customers and partners, we have improved the packing station experience in Dynamics 365 for Operations version 1611 (November 2016) to make sure that it seamlessly integrates with the rest of the warehouse workflows.

At a high level, we have improved the following areas:

  • The setup experience is better aligned with the rest of Warehouse management.
  • The packing station is now treated as a location. When warehouse workers log in at the packing station, they only see and operate on shipments and containers that are planned for that specific packing location.
  • Work can now be generated to bring goods from the packing station to the staging and bay door locations.
  • The concept of container groups has been introduced to allow multiple containers to be moved out of the packing station in one operation.
  • A new packing policy has been introduced to give warehouse managers greater flexibility in how containers should be handled in the packing process.
  • The concept of manual manifesting has been introduced to allow a loosely coupled integration to external transportation provider systems.
  • The user experience for packing and container processing has been improved.

Upgrade from Dynamics AX 2012

If you are not upgrading from a previous Dynamics AX version where packing processes were used, you can skip this section.

The In packing status has been removed from shipments and loads because it did not work consistently and resulted in redundant data. Consequently, the related list pages for shipments and loads in packing have been deprecated. Containers in packing are now tracked at the location level.

In previous versions, the packing location was defined by a Location profile ID. In the current version, this has changed so that packing locations are defined using location types, to align with the process for identifying staging and final shipping locations.

It is possible to continue operating with the current setting, but we recommend that you update it because the legacy packing setting will be deprecated in future versions.

Please note that this process is irreversible. After clearing and saving the Profile ID for packing location field, the field will be disabled and can no longer be used. For installations where the legacy setting has not been used, it will always be disabled.

IMPORTANT UPGRADE GUIDELINES

  • All containers must be closed before upgrade. In previous versions, containers did not have a container packing policy assigned at creation. In the current version and going forward, a container packing policy must be assigned to containers in order to process them. This can be mitigated by using the Change container packing policy function on the Container form, but it is not a recommended approach.
  • In the current version and going forward, locations that operate as packing stations must be license plate (LP) controlled. If the packing station is not LP controlled, it is not possible to process containers after upgrade.
  • Before using the newly upgraded system, make sure that the right container packing policies have been defined and are associated with the correct packing profiles.

 pack1 

Define location profiles and packing locations

Set up packing location profiles and packing locations in the same way as setting up staging and final shipping locations.

Create a location type that identifies the packing location.

pack2

When setting up the parameters for Warehouse management, select Pack in the Packing location type field.

pack3

Create one or more location profiles that use the packing location type.

 pack4

Notice the setup for the packing location profile:

  • Use license plate tracking must be set to Yes.
  • Allow negative inventory should be set to No.
  • Allow mixed items should be set to Yes.
  • Allow mixed inventory statuses should be set to Yes.
  • Allow mixed inventory batches should be set to Yes.

Set up the locations that operate as packing stations to use the new packing location profile ID.

pack5

Use location directives to bring goods to the packing station

Overall, the concept of using location directives to bring goods to the packing station is not changed.

The setup for using work to move goods out of the packing station will be described later in this document.

 pack7

 pack8

Log in to the packing station

Before the packing station can be used, the Dynamics 365 for Operations user account must be associated with a user, and the user must be created as a warehouse worker, as shown below.

 pack9

pack10

 

When navigating to the Pack form, the warehouse worker will be asked to log in to the packing station by specifying the location of the packing station and the packing profile.

pack11

As it is very common that the same warehouse worker will work at the same packing station for a longer period, it is recommended to set up default values for the worker as shown below.

pack12

For the default packing station, it is possible to set up any combination: you can choose site only, site and warehouse, or even site, warehouse, and location if the worker always logs in to the same packing station. All of these values are defaults and can be changed after login.

The default profile can be used by the warehouse manager to guide the warehouse worker on what process to use when operating at the packing station, or by the warehouse worker to store favorite packing settings.

When selecting a Packing profile ID that has a Container packing policy associated with it, it is not possible to change the Container packing policy. If selecting a Packing profile ID without a Container packing policy, it will be possible to specify another default Container packing policy.

Set up container packing policies

The Close container profile field has been renamed to Container packing policy because it now has a different impact on how containers are processed during packing.

In previous versions, the Close container profile field was only used when a container was closed, to determine which final shipping location the container should go to and what unit of measure should be used as the default value when weighing the container. Because no work creation was supported, the container immediately appeared at the final shipping location after it was closed.

In the current version, the Container packing policy defines how the container should be processed and consequently, it is applied immediately when a new container is created.

pack13 

On the Overview tab, it is still possible to specify the actions when closing the container, but now it is possible to operate with or without work creation as well as defining when the container should be released from the packing station.

Container release policy

Using this parameter, it is possible to define what should happen when the container is released from the packing station by specifying one of the following options:

  • Make available at final shipping location. This is the same as in the previous versions. As soon as the container is released, it is updated to the specified location for the final shipping location. When using this option, the field Default location for final shipment is enabled and used for specifying a preferred location for the container after closing it.
  • Create work to move container from packing station. Using this option will create work for moving the container from the packing station to the staging area or directly to the bay door. When using this option, the field Work template is enabled and can be used for specifying a work template that should be applied when creating the work for the container.

Work template

A new work order type called Packed container picking is introduced. The work order type is used to describe the work created after a container or container group is released from the packing station.

In most cases, it is recommended to create work for moving the containers, as this better represents the actual manual processes in the warehouse. There may be very simple setups, or setups where the packing station is located directly in the bay door area, where it is preferable to make the container available at the final shipping location immediately.

It is not possible to use work breaks, but it is possible to set up different work templates for different warehouses depending on the warehouse or the shipping carrier.

The examples below show templates for moving a container from staging or directly to the bay door.

pack14

pack15

 

Container closing policy

By using this parameter, it is possible to define what should happen when closing the container by specifying one of the following options:

  • Automatic release. The container will be considered released from the packing station and the action specified under the Container release policy will be triggered.
  • Delayed release. The container will not be released from the packing station immediately. It will be up to the warehouse worker to release it at a later point in time.
  • Optional. During the process of closing the container, it will be possible to choose whether the container should be released at the closing time.

The setting of this parameter will depend on the nature of the individual customer and packing station. If the packing station mainly handles single-container shipments directly to customers, it will be most natural to release the containers immediately. If the packing station handles shipments with multiple containers or even pallets, it will probably be optimal to delay the release until the entire shipment or pallet is packed and ready for pickup.

Weight unit

This parameter enables the user to choose a default unit of measure used for container closing and manifesting. Usually this will be the unit of measure of the scale used at the packing station.

The parameter will work for policies with or without work creation.

Manifesting

Manifesting is the process of specifying the weight of a container, container group or shipment as well as a tracking ID provided by the transportation provider.

There is no direct integration with external transportation provider systems. The warehouse worker must print the label from the external provider system and scan the tracking number when completing the manifest procedure.

As manifest requirements vary from customer to customer and even from shipment to shipment, the packing policies allow a lot of flexibility when it comes to the workflow. It is possible to set up manifests for containers, container groups and shipments in any combination.

If using a multiple-level manifest procedure, the following is required:

  • All containers must be manifested before the container group is manifested.
  • All container groups must be manifested before the shipment is manifested.

Manifesting will be described in more detail when the workflow of the packing station is explained.

Container manifest

Container manifesting should be enabled if it is required to complete a manifest for every single container packed at the packing station.

Manifesting is activated by the parameter Manifest requirement for container. If this is set to Manual, the manifesting will be included as a requirement in the packing workflow. It will not be possible to close and release the container before the manifesting is completed. If the parameter is set to Transportation management, the manifesting will still be performed through the TMS rate engines. Please note that this requires partners to implement a specific engine for the transportation provider and will not work out of the box in the current version. 

If activating the Automatic manifest at container close, the warehouse worker must specify the manifest information as part of the Close container dialog to avoid a two-step process. This is usually the preferred setting if the same worker is packing and manifests the containers.

If activating the Print container content parameter, the container content report will automatically be printed as part of the container close. The report can of course also be printed and reprinted on demand.

Container group manifest

Container group manifesting should be enabled if it is required to complete a manifest for every single container group packed at the packing station. This will normally be used if containers are packed on a pallet and the entire pallet is manifested.

Manifesting is activated by the parameter Manifest requirement for container group. If this is set to Manual, the manifesting will be included as a requirement in the packing workflow. All containers included in the group must be closed before the group can be manifested.

There’s no transportation management engine support for container groups in the current version.

There’s no manifest report for container groups in the current version.

Shipment manifest

Shipment manifest should be enabled if it is required to complete a manifest for the entire shipment packed at the packing station. This will normally be used when one consolidated manifest is required even though the shipment consists of multiple containers or container groups.

Manifesting is activated by the parameter Manifest requirement for shipment. If this is set to Manual, the shipment manifest will be included as a requirement in the packing workflow. It will not be possible to release any containers on the shipment before the manifesting is completed. If the parameter is set to Transportation management, the manifesting will still be performed through the TMS rate engines. Please note that this requires partners to implement a specific engine for the transportation provider and will not work out of the box in the current version. 

If activating the Print packing slip parameter, the packing slip report will automatically be printed as part of the shipment manifest. The report can of course also be printed and reprinted on demand.

 

Packing station workflow

The first step for the warehouse worker is to log in to the packing station. In the example below, we log in to the packing station at the location Pack.

pack16

Overall, the user interface for the packing station looks like the packing station in previous versions, but has several additions and workflow optimizations.

pack17

Prepare for packing

After the warehouse worker scans the shipment ID or the license plate identifying the shipment, the lines will be displayed and the packing of the shipment can be started.

Container packing policy

The new Container packing policy is now applied when containers are created. Normally the same packing policy will be used for a single shipment or even for the entire packing station.

Container group license plate ID

The Container group concept is introduced. This can be used if containers will be packed on a pallet and will be moved out of the packing station in one operation, instead of moving the individual containers.

Usually this field should be set up when starting to pack on a new pallet. The warehouse worker should scan the pallet license plate and it will work as a default container group for all new containers being packed and closed.

Please note that Container groups can only be used for containers that have a Container packing policy with delayed work creation.

pack18

It is possible to use an existing license plate, but if scanning a non-existing license plate, it will automatically be created.

License plates for containers that are already shipped should not be reused.

When closing the container, the default value will be used in the Close container dialog and it is possible to change it.

One way to add a container to a container group is through one of the container forms, for example, Containers for shipment, Open containers at packing station, etc.

 pack19

Both open and closed containers can be added to a container group. However, there are several restrictions on when this can be done. For example, released containers can't be added to a container group, and containers from different shipments or with different container packing policies can't be added to the same container group.

Through the container forms, it is also possible to remove a container from a container group by simply clearing the Container group license plate ID field.

Pack the shipment

The shipment can now be packed in one or more containers. The first step is to create a new container with the selected Container packing profile.

pack20

The new container will be automatically selected in the Open containers view, and packing can now be started. The view also shows the status of the container, for example, how much has been packed and whether the manifesting requirement has been met for the container.

 pack21 

When an open container is selected, the warehouse worker will be able to start packing by scanning items using the Item packing section.

pack22

A new feature has been added so that it is now possible to pack everything on the selected open line by using the Pack button on the Action Pane. This will allow the warehouse worker to speed up packing shipments where it is not necessary to scan and pack individual items.

 

Manifest the container

If the container is using a policy with manual container manifest requirements, the Manifest container dialog can be opened and the container weight and tracking number can be specified.

pack23

Close the container

When the container is packed and any container manifest requirements are met, the warehouse worker can close the container. As the container is already manifested, it will no longer be possible to change the weight or the container tracking number.

pack24

If the container is using a policy with manual container manifest requirements and the policy parameter Automatic manifest at container close is enabled, the worker will be able to scan the container tracking number as part of the closing process.

pack25

If the container is not using a packing policy with container manifest requirements, the tracking number will not be displayed in the Close container dialog and the Manifest container button will not be enabled.

pack26

After the container is closed, the actions specified in the Container packing policy are applied:

  • If the policy parameter Container closing policy is set to Automatic release, the container will immediately be released when the container is closed.
  • If the policy parameter Container closing policy is set to Delayed release, the container will be closed, but will be pending until released by the warehouse worker.
  • If the policy parameter Container closing policy is set to Optional, the container will be closed and the worker can decide to release it as part of closing the container.

pack27

Delaying the release of the container can be very helpful in scenarios with or without work creation. In most situations, it is not optimal to create work every time a container is closed; it is better to wait until the entire shipment is packed. Delayed release can also be utilized when running without work creation, so that the containers are kept at the packing station until the truck arrives and the containers can be loaded.

Remember that an even more optimized work creation process can be achieved by using container groups with container packing policies that create work.

Please note that the old Close container form has been deprecated. If you need to support a workflow where packing and closing take place in separate operations, the container closing should be performed on the Containers form.

 

Release the container

If the container is not part of a container group and there’s no requirement for shipment manifesting, the container can be released as part of the container close.

Depending on the Container packing policy, one of the following actions will occur when releasing the container:

  • If the policy parameter Container release policy is set to Make available at final shipping location, the container will immediately be available at the specified location when the container is closed.
  • If the policy parameter Container release policy is set to Create work to move container from packing station, work will be created for moving the container out of the packing station.

 

Unmanifest container

If the container is still open and manifested, it can be unmanifested from the Pack form. Containers cannot be unmanifested when they are closed or released.

 

Container group manifesting

If a Container group license plate is selected, the container group with the selected group license plate can be manifested using the Manifest container group button.

To manifest the container group, all manifest requirements for the containers in the group must be met and the containers must be closed.

When manifesting a container group, the following dialog will be shown and it is possible to specify the total gross weight and tracking number for the container group.

pack28

Unmanifest container group

If a Container group license plate is selected, the container group with the selected group license plate can be unmanifested using the Unmanifest container group button.

A container group cannot be unmanifested if the containers in the group are already released.

When the unmanifesting process is completed, the tracking number is removed from the container group and a confirmation message is displayed.

Manifest shipment

If a shipment is selected, the shipment can be manifested using the Manifest shipment button.

To manifest the shipment, all manifest requirements for the containers and the container groups in the shipment must be met and the containers must be closed.

When manifesting a shipment, the following dialog will be shown and it is possible to specify the total gross weight and tracking number for the shipment.

pack29

Unmanifest shipment

If a shipment is selected, the shipment can be unmanifested using the Unmanifest shipment button.

When the unmanifesting process is completed, the tracking number is removed from the shipment and a confirmation message is displayed.

View containers

There are different ways to view containers based on the context. When packing a shipment, it is useful to see containers that are part of the shipment.

The following form shows open, closed and released containers.

From this form, the packer can also perform all operations on containers such as closing, manifesting, reopening and releasing.

pack30

It is also possible to see all containers that are physically at the packing station. The Packing station form has buttons that can be used to view all open and closed containers at the packing station. These views will not be restricted to a specific shipment.

pack31

pack32

The views can be very helpful in situations where one worker is packing the containers and another worker is manifesting and releasing them.

Furthermore, a consolidated view of all containers is also available. This will mostly be useful for users working outside the context of a single packing station.

 pack33

Work creation

If the packing station is operating with a container packing policy with work creation, work will be created when the container is released as shown in the example below, where we are using a work template including staging.

pack34

Work cancellation

If a container was released by mistake, it is possible to reopen it if the work has not been started yet. If the warehouse worker reopens the container, the corresponding work will be cancelled.

If another worker cancels the Packed container picking work, the corresponding container will also be reopened.

In scenarios where the container has already been moved out of the packing station, it is no longer possible to reopen the container by cancelling the work. The container must be manually moved back to the packing station for reprocessing.

Work execution

The Packed container picking work is executed through the mobile device using menu items for packed container picking and packed container loading, as shown in the example below.

Packed container picking

pack35

pack36

 

Packed container loading

pack38

pack39

 

Work creation and possibilities for container groups

The work that gets created for container groups is very similar to the work created for a single container.

pack40

You will notice a couple of differences: the Target license plate ID is the Container group license plate ID rather than the Container ID, and the work quantity is aggregated from all the containers in the group.

The work execution itself is also similar.

pack41

pack42

There are some additional possibilities with the container groups that are not available to individual containers. Namely, it is possible to remove a container from a container group if the container group is at a staging location.

To make this possible, there is a new warehouse mobile device menu item called Remove container from group.

pack43

By default, when a container is removed from the group, the related Packed container picking work is updated to reflect that the container group now has a different number of containers on it.

It is also possible to turn on the Cancel related work when removing container from group setting. If that is the case, then the related Packed container picking work gets canceled when the container is removed from the group.

Here is how the flow for removing a container looks.

For example, let’s take the container group from the above work. It has three containers.

pack44

pack45

pack46

pack47

If we click the Selected container button, we can see which containers are going to be removed. Please note that none of the containers have been removed at this point, and it is possible to exit the flow by clicking the Cancel button.

pack48

When the Remove containers button is selected, the flow ends.

pack49

This is how the work looks now.

pack50

The removed containers are now at the staging location and can be moved freely.

Use multiple packing stations

In this section, we’ll describe how to set up multiple packing stations and use them for packing the same shipment. Of course, it is possible to use multiple packing stations for multiple shipments.

There are multiple ways to end up in a situation where there are items from the same shipment at different packing stations, for example, overriding the put location and selecting the Full button during sales order picking work, but we’ll describe one way to do it.

First, let's set up the work template. To guide items to different packing stations, there need to be multiple work orders. To accomplish this, we'll break work by Item ID (and by shipment, but that is not relevant for our case).

pack51

Setup of the work template query.

pack52

Setup of the work breaks.

pack53 

Now, we need to set up the location directives.

We need to set up one location directive for the first item.

pack54

Location directive query.

pack55

Location directive action query.

pack56

The location directive for the other item is similar; it just filters for item A0002 and location Pack2.

If we now have a sales order with these two items and release it to the warehouse, it will generate two work headers that will bring the items to different packing stations.

pack57

pack58

pack59

The user can now proceed and pack the items at two different packing stations.

Currently, the interaction between the two packing stations is very limited. For example, it is not possible to move items between them even if the items belong to the same shipment. It is also not possible to add containers to container groups that are at different packing stations.

Also, after the container is closed and released, it is not possible to move the container back for repacking at a different packing station.

It’s important to note that while items are at the packing station, it is still possible to adjust them, which can leave the container in an inconsistent state.

 

Announcing Dynamics 365 for Operations – Warehousing


We're very happy to announce that Dynamics 365 for Operations – Warehousing has been made available in the Windows Store and on Google Play. This app empowers warehouse workers in your organization to complete tasks in a warehouse by using mobile devices. It enables material handling, receiving, picking, putting, cycle counting, and production processes with your Dynamics 365 for Operations subscription.

The Dynamics 365 for Operations - Warehousing app includes the following features to boost productivity:

  • A tailored interface designed for fast warehouse scanning
  • Support for over 40 different warehouse processes
  • A custom-built on-screen numeric keypad that makes entering numbers easy
  • A simple calculator that lets you enter and calculate quantities in a breeze
  • The ability to adjust the font size and the width of input fields on any device

This blog post will take you through the prerequisites, how to navigate the app, and the options to configure the app in Dynamics 365 for Operations.

Prerequisites

The app is available on Android and Windows operating systems. To use this app, you must have one of the following supported operating systems installed on your devices. You must also have one of the following supported versions of Dynamics 365 for Operations.

Use the information in the following table to evaluate if your hardware and software environment is ready to support the installation.

Platform Version
Android 4.4, 5.0, 6.0
Windows (UWP) Windows 10 (all versions)
Dynamics 365 for Operations Microsoft Dynamics 365 for Operations version 1611, or Microsoft Dynamics AX version 7.0/7.0.1 with Microsoft Dynamics AX platform update 2 and hotfix KB 3210014

Install the app

The app is available for download here:

For detailed steps on how to install and configure the app, refer to this tutorial: Install and configure Dynamics 365 for Operations - Warehousing

Navigating the app

The app comes with a new user experience. In this section, I will walk through the different pieces and elements that we have changed in the UI.

Log-in screen and menu

Once the app is installed and configured to connect to a Dynamics 365 for Operations instance, you will be presented with a log-in screen. Sign in with the User ID and password of the warehouse worker. Learn how to manage warehouse workers with this tutorial: Manage warehouse workers.

The image below shows the log-in experience, as well as the menu structure and navigation.

log-in-6

Task and details page

For our most common flows, which follow the same pattern of scanning input fields, we have split the information across two pages: the task page and the details page. The task page shows the main input field, three rows of additional information, and the previously scanned value. Sometimes a screen has more information than can fit in three rows, so we added the details page, which contains all overflowing information and input fields, as well as a product picture if one exists for the item. You can control the order in which information is prioritized on the task page from the Warehouse app field priorities page in Dynamics 365 for Operations. This is explained a bit further down in this blog post.

details-page-2

Numeric keypad

The app comes with a custom numeric keypad, specially designed with rugged environments in mind. It has large buttons that are easy to touch, and a nifty calculator for those occasions where quantities need to be converted on the fly.

numeric-keypad-4

Alternatively, we have added a stepper for quantity input fields, which lets you add to or subtract from the quantity without using the numeric keypad. This can be useful when the quantity is small and just a quick change is needed.

stepper-1

Multiple input fields

If a screen has multiple input fields with values that haven't been seen before, the app recognizes this and displays a different UI. If there are three or fewer such input fields, a carousel is shown, which allows the warehouse worker to quickly switch between input fields without leaving the task page.

carousel-3

If more than three input fields are filled out and not seen before, a multi-input page that lists all the input fields is displayed. The example below is from a movement of goods: an existing license plate is scanned for movement, and the app receives information about the item, quantity, unit, and so on that is on the license plate. It then displays that content as multiple input fields on the task page, so the warehouse worker can quickly review it and move on to the next step.

multi-input-4

Action pane

As you might have noticed from the previous pictures, no buttons are displayed on any of the screens except for the green OK button. We have deliberately moved all other buttons to an action pane that is accessible from the hamburger menu in the top right corner.

action-pane-3

Settings in Dynamics 365 for Operations

There are two new pages added in Dynamics 365 for Operations:

  • Warehouse management > Setup > Mobile device > Warehouse app field names
  • Warehouse management > Setup > Mobile device > Warehouse app field priorities

Below, I will explain how these pages are used and how they relate to the app.

Configure warehouse app field names

In Operations you can configure how metadata should be shown on a warehouse mobile device on the Warehouse app field names page.

In a new environment or company, you can select Create default setup to generate all field names that exist in any of the warehouse mobile device workflows, and assign them a default preferred input mode and input type.

Once you've generated a list of field names, the following options are available:

  • Preferred input mode - This option defines whether a scanning field or a manual entry field is shown for the selected field name. This is useful for distinguishing fields depending on whether barcodes are used for the field. Note: For field names with the preferred input mode set to Scanning, information can still be entered manually in case the barcode is unreadable or damaged.
  • Input type - This option defines what input type is used for the selected field name. Four options are available:
    • Selection - The selection list is used for field names that contain a list of options to choose from. This option is not editable and can only be changed through extension.
    • Date - Field names specified as Date show a date format with the label, to help warehouse workers know which format to enter the date in. This option is not editable and can only be changed through extension.
    • Alpha - If Alpha is selected, the device keyboard is used when entering information manually in the app. The keyboard experience can vary depending on the device used.
    • Numeric - For field names that you know will use numeric input only, you can select this option to display the custom numeric keypad with the input field instead of the device keyboard.

warehouse-app-field-names

Configure warehouse app field priority

On the Warehouse app field priorities page, it is possible to put field names into different priority groups. This makes it possible to decide which information should be promoted to the main task page when warehouse workers are performing work using the app.

If you click Create default setup, a default set of priority groups will be generated. It is possible to create as many priority groups as needed, but only three priority groups will be shown on the task page of the app at any given time.

When Operations sends metadata to the app, it gives each field a relative priority based on its priority group, and the app displays the fields from the first three priority groups contained in the metadata on the task page. The remaining, overflowing metadata is presented on the secondary details page.

warehouse-app-field-priorities
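To make the selection concrete, the sketch below shows one way to partition fields between the task page and the details page based on priority groups. It is only an illustration of the rule described above; the class name, the parameter shape, and the use of a Set are my own assumptions, not the app's implementation.

class ConFieldPriorityPartitionSketch
{
    // Sketch only: picks the fields that would appear on the task page.
    // _priorityGroupByField maps a field name to its priority group number;
    // fields belonging to the first three priority groups present in the
    // metadata go to the task page, the rest overflow to the details page.
    public static Set taskPageFields(Map _priorityGroupByField)
    {
        Set groups      = new Set(Types::Integer);  // a Set enumerates its values in ascending order
        Set shownGroups = new Set(Types::Integer);
        Set result      = new Set(Types::String);
        MapEnumerator fields = _priorityGroupByField.getEnumerator();
        SetEnumerator groupEnumerator;

        while (fields.moveNext())
        {
            groups.add(fields.currentValue());
        }

        groupEnumerator = groups.getEnumerator();
        while (groupEnumerator.moveNext() && shownGroups.elements() < 3)
        {
            shownGroups.add(groupEnumerator.current());
        }

        fields = _priorityGroupByField.getEnumerator();
        while (fields.moveNext())
        {
            if (shownGroups.in(fields.currentValue()))
            {
                result.add(fields.currentKey());
            }
        }

        return result;
    }
}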

Summary

This blog post has provided a brief overview of Dynamics 365 for Operations - Warehousing. As always, we would appreciate any feedback you may have. We hope you enjoy using the app.

Financial dimensions on a purchase requisition


Concepts

Financial dimensions are available on each requisition line in the line details. The purpose is to let you update the accounting distribution ledger account on the requisition line. The accounting distribution ledger account consists of a main account and financial dimension values. Accounting distributions define how an amount is accounted for, such as how the expense, tax, or charges are accounted for and posted in the ledger.

Updates to the financial dimension values on the requisition line will update the accounting distribution ledger account, not the other way around.

Create a requisition line and set up initial financial dimensions values

When a purchase requisition line is created, the values of financial dimensions are assigned from various sources that are specific to the purchase requisition line.

The logic applies values from what is considered the most important (most specific) source toward the least important source. The sources are ranked in the following order:

  1. Project
  2. Requester
  3. Vendor
  4. Item

If one of the sources is not present on the requisition line (for example, the line is not related to a project or does not reference a specific item from the master data), that source is ignored.

The financial dimensions are only assigned a value once. This is done by first taking all dimension values available from the project, and then taking the dimension values from the Requester that are used but not already assigned a value. Next, the values from the Vendor are used for dimensions that were not already assigned a value, and so on. In the following example, the default financial dimension values are set up for Vendor 1 and the Requester.

Financial dimensions on different Sources Business unit (BU) Department (Dep)
Vendor 1 001 025
Requester (Worker) 033

 

Create a requisition line for an item that has Vendor 1 as its default vendor.

Business unit (BU) Department (Dep)
1. Create the line and select the item. Vendor 1 is defaulted to the line. 001 033

 

As you can see from the table above, the Requester's default value for the Department dimension is 033. This value takes priority over the Vendor's default Department value (025).

If the requisition is not for a project, then that source will not contribute to the financial dimension values.

The accounting distribution ledger account dimension values will be set according to the final result of the values on the requisition line.
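In pseudo-code terms, the rule is simply that the first ranked source to supply a value for a dimension wins. The sketch below only illustrates that rule; the class name and the map-based parameters are assumptions made for the example and are not the application code.

class ConPurchReqDimensionDefaultingSketch
{
    // Sketch only: _rankedSourceDefaults contains one Map (dimension name -> value)
    // per source, ordered Project, Requester, Vendor, Item.
    public static Map defaultLineDimensions(List _rankedSourceDefaults)
    {
        Map            result  = new Map(Types::String, Types::String);
        ListEnumerator sources = _rankedSourceDefaults.getEnumerator();

        while (sources.moveNext())
        {
            Map           sourceDefaults = sources.current();
            MapEnumerator dims           = sourceDefaults.getEnumerator();

            while (dims.moveNext())
            {
                // Each dimension is assigned only once; a lower ranked source
                // never overwrites a value supplied by a higher ranked one.
                if (!result.exists(dims.currentKey()))
                {
                    result.insert(dims.currentKey(), dims.currentValue());
                }
            }
        }

        return result;
    }
}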

Manually update the financial dimension on a requisition line

When you update a financial dimension value on the requisition line to a non-blank value, that will update the dimension values on the accounting distributions. If the accounting distribution lines are split, then all lines will be updated with the new value.

  • For non-stocked items and lines not referring to a catalog item - Setting a financial dimension value to blank on the requisition line will not update the accounting distribution.
  • For stocked items and fixed assets - Setting a financial dimension value to blank on the requisition line will update the accounting distribution by removing the dimension.

For stocked items and fixed assets, the financial dimensions on the requisition line and on the accounting distributions must stay in sync. Note that the accounting distributions cannot be split or edited manually in these cases.
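The rules above can be condensed into a single condition. The sketch below is only an illustration; the class and parameter names are assumptions, not the product implementation.

class ConReqLineDimensionUpdateSketch
{
    // Sketch only: decides whether a changed financial dimension value on the
    // requisition line should be pushed to the accounting distribution lines.
    public static boolean shouldUpdateAccountingDistribution(str _newDimensionValue, boolean _isStockedItemOrFixedAsset)
    {
        if (_newDimensionValue != '')
        {
            // A non-blank value always updates the accounting distributions
            // (all split lines receive the new value).
            return true;
        }

        // A blank value only updates (removes) the dimension for stocked items and
        // fixed assets, where the line and the distributions must stay in sync.
        return _isStockedItemOrFixedAsset;
    }
}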

Update a vendor or project

Two of the original sources for setting financial dimensions can be changed on the requisition line: the project and the vendor. Changing one of these reinitializes the financial dimensions on the requisition line according to the ranking of the sources described above. The reinitialization only updates the financial dimensions that do not have a value, i.e. are blank. Existing financial dimension values on the purchase requisition line take priority and keep their value.

Changing the project or vendor also means that the accounting distribution ledger account is reset and updated based on the new project or vendor and the updated set of financial dimensions. If the accounting distributions are split, the split will be removed during the reset.

Note that if you want to ensure that the financial dimensions are reset to how the requisition line was initially created when changing the project or vendor, clear all the financial dimensions on the requisition line that you want to re-default.

Example

Set up the following default values.

Financial dimensions on different Sources Business unit (BU) Cost center (CC) Department (Dep)
Vendor 1 001 025
Vendor 2 002 008
Vendor 3 003 009 026
Requester (Worker) 033

 

Create a requisition line for a non-catalog item.

Business unit (BU) Cost center (CC) Department (Dep)
1. Create a line without a Vendor, the Requester’s default Dep is added. 033
2. Select Vendor 1. Because BU is blank and Dep is not blank only BU is updated from Vendor. 001 033
3. Change to Vendor 2. Only CC is blank and can be updated from Vendor 2. 001 008 033
4. Delete BU on line details and Dep. Manually update the cost center to 010. 010
  5. Select Vendor 3.
  • BU is blank and is therefore updated with Vendor 3's default dimension 003.
  • CC has the value 010, so it keeps that value and ignores Vendor 3’s CC 009.
  • Dep is also blank, but because the Requester has the default dimension 033, it overrules the Dep dimension 026 from the vendor.
003 010 033

 

Use a template to update the accounting distribution

You can only use a template for non-stocked items and lines that don't refer to a catalog item. The lines must also not be categorized as fixed assets.

When you apply a template to your requisition line, it updates the accounting distribution directly with a split distribution across different dimensions. Applying a template does not change the financial dimension values on the requisition line.

However, if you manually change a financial dimension value to a non-blank value on the requisition line, then that dimension will be updated on all split accounting distribution lines. Clearing a financial dimension value on the requisition line will not change the accounting distributions.

If you change the vendor or project, the default financial dimensions from the vendor or project update the financial dimensions on the requisition line as described above; however, using a template prevents the accounting distribution split from being cleared. Any updated dimension value will update all the split lines in the accounting distributions.

If you remove the template from the requisition line, then the accounting distributions will be reset based on the financial dimension values on the requisition line. Any split lines in the accounting distributions will be removed.

Customizing the Warehousing Mobile App


Introduction

We last looked at the Warehouse Mobile Devices Portal (WMDP) in detail in a series of blog posts here, here, and here.  The last one covered how to build custom solutions and walked through building a new sample workflow for the WMDP.  This post updates that sample to cover some of the changes that have occurred with the Advanced Warehousing solution and the Dynamics 365 for Finance and Operations - Enterprise Edition warehousing application.

WMDP vs Dynamics 365 for Warehousing Mobile App

The Warehouse Mobile Devices Portal (WMDP) interface, which is an IIS-based HTML solution (described in detail here), is being deprecated in the July 2017 release of Dynamics 365 for Finance and Operations (see deprecated features list here). Replacing this is a native mobile application shipping on Android and Windows 10 devices.  The mobile app is a complete replacement for the WMDP and contains a superset of capabilities – all existing workflows available in the WMDP will operate in the new mobile app.  You can find more detail on the mobile app here and here.

Customizing the new Dynamics 365 for Warehousing Mobile App

The process for customizing the new mobile app is largely unchanged – you can still utilize the X++ class hierarchy discussed in the previous blog post.  However, I want to walk through some of the differences that enable customizations to exist purely as extensions.  The previous solution required a small set of overlayered code.  Moving forward, this practice is discouraged, and we recommend that all partners and customers create extensions for any customizations.

As before, we will be focusing on building a new workflow around scanning and weighing a container.  The inherent design concept behind the Advanced Warehousing solution is unchanged – you will still need to think about and design these screens in terms of a state machine, with clear transitions between the states.  The definition of what we will build looks like this:

WHSWorkExecuteMode and WHSWorkActivity Enumerations

Just as in the previous blog post, to add a new “indirect work mode” workflow we will need to add values to the two enumerations WHSWorkExecuteMode and WHSWorkActivity.  The new enum names need to match exactly, as one will be used to instantiate the other deep inside the framework.  Note that both should be added as enumeration extensions built in a custom model.  Once this has been done, it will be possible to create the menu item in the UI, since the WHSWorkActivity enumeration controls the list of available workflows in the UI:

You can see the extension enumeration values in the following screenshots:

  

WHSWorkExecuteDisplay class

The core logic will exist within a new class you will create, which will be derived from the WhsWorkExecuteDisplay base class.  This class is largely defined the same way as in the WMDP-based example; however, there is now a much easier way to introduce the mapping between the execute mode defined in the menu item and the actual class that performs the workflow logic – we can use attributes to map the two together.  This also alleviates the need to overlayer the base WHSWorkExecuteDisplay class to add support for new derived classes (as the previous WHSWorkExecuteDisplay “factory method” construct forced us to do).

The new class will be defined like this:

[WHSWorkExecuteMode(WHSWorkExecuteMode::WeighContainer)]
class conWhsWorkExecuteDisplayContainerWeight extends WhsWorkExecuteDisplay
{
}

Note that all the new classes I am adding in this example will be prefixed with “con” (for Contoso).  Since there is still no namespace support, partner code is expected to leverage this naming scheme to minimize naming conflicts moving forward.

The displayForm method is required and acts as the primary entry point to the state-machine-based workflow.  This is completely unchanged from the previous example:

[WHSWorkExecuteMode(WHSWorkExecuteMode::WeighContainer)]
class conWhsWorkExecuteDisplayContainerWeight extends WhsWorkExecuteDisplay
{
    container displayForm(container _con, str _buttonClicked = '')
    {
        container    ret = connull();
        container    con = _con;

        pass = WHSRFPassthrough::create(conPeek(_con, #PassthroughInfo));

        if (this.hasError(_con))
        {
            con = conDel(con, #ControlsStart, 1);
        }

        switch (step)
        {
            case conWeighContainerStep::ScanContainerId:
                ret = this.getContainerStep(ret);
                break;

            case conWeighContainerStep::EnterWeight:
                ret = this.getWeightStep(ret, con);
                break;
       
            case conWeighContainerStep::ProcessWeight:
                ret = this.processWeightStep(ret, con);
                break;

            default:
                break;
        }

        ret = this.updateModeStepPass(ret, WHSWorkExecuteMode::WeighContainer, step, pass);

        return ret;
    }
}

A detailed analysis of this code can be found in the previous blog post – we will skip forward to the definition of the getContainerStep method, which is where the first screen is defined.  The two methods used to define the first screen are below:

private container getContainerStep(container _ret)
{
    _ret = this.buildGetContainerId(_ret);
    step = conWeighContainerStep::EnterWeight;

    return _ret;
}

container buildGetContainerId(container _con)
{
    container ret = _con;

    ret += [this.buildControl(#RFLabel, #Scan, 'Scan a container', 1, '', #WHSRFUndefinedDataType, '', 0)];
    ret += [this.buildControl(#RFText, conWHSControls::ContainerId, "@WAX1422", 1, pass.lookupStr(conWHSControls::ContainerId), extendedTypeNum(WHSContainerId), '', 0)];
    ret += [this.buildControl(#RFButton, #RFOK, "@SYS5473", 1, '', #WHSRFUndefinedDataType, '', 1)];
    ret += [this.buildControl(#RFButton, #RFCancel, "@SYS50163", 1, '', #WHSRFUndefinedDataType, '', 0)];

    return ret;
}

Note that I am using a class to define any custom constants required for the Warehousing logic.  This was typically done with macros in the previous version – but these can cause some issues in extension scenarios.  So instead we are encouraging partners to define a simple class that can group all their constants together – which can then be referenced as you see in the code above.  The only area where this does not work is in attribute definitions – this will still need a Macro or String definition.  Here is mine so far for this project:

class conWHSControls
{
    public static const str ContainerId = "ContainerId";
    public static const str Weight = "Weight";
}

The other important thing to notice in the above code is that I have explicitly defined the data type of the input field (in this case extendedTypeNum(WHSContainerId)).  This is important as it tells the framework exactly what type of input field to construct – which brings us to the new classes you need to add to support the new app functionality.

New Fields

In the previous version of this blog, we discussed that because we were adding new fields to the warehousing flows that were not previously handled in the framework, we had to modify (that is, overlayer) some code in the WHSRFControlData::processControl method.  This allowed the framework to understand how to handle the ContainerId and Weight fields when they were processed by the WMDP framework.

In the new model these features are controlled through two new base classes to customize and manage the properties of fields.  The WHSField class defines the display properties of the field in the mobile app – and it is where the default input mode and display priorities are extracted when the user configures the system using the process described here.  The WhsControl class defines the logic necessary for processing the data into the field values collection.  For my sample, we need to add support for the ContainerId field – so I have added the following two new classes:

[WhsControlFactory('ContainerId')]
class conWhsControlContainerId extends WhsControl
{
    public boolean process()
    {
        if (!super())
        {
            return false;
        }

        fieldValues.insert(conWHSControls::ContainerId, this.data);

        return true;
    }
}

[WHSFieldEDT(extendedTypeStr(WHSContainerId))]
class conWHSFieldContainerId extends WHSField
{
    private const WHSFieldClassName Name = "@WAX1422";
    private const WHSFieldDisplayPriority  Priority    = 65;
    private const WHSFieldDisplayPriority  SubPriority = 10;
    private const WHSFieldInputMode InputMode = WHSFieldInputMode::Scanning;
    private const WHSFieldInputType InputType = WHSFieldInputType::Alpha;

    protected void initValues()
    {
        this.defaultName = Name;
        this.defaultPriority = Priority;
        this.defaultSubPriority = SubPriority;
        this.defaultInputMode = InputMode;
        this.defaultInputType = InputType;
    }
}

Obviously my conWhsControlContainerId class is not doing much – it simply takes the data from the control and places it into the fieldValues map under the ContainerId name, which is how I will look for the data and use it later in the system.  If there were more complex validation or mapping logic, I could place it here.  For example, the following is a snapshot of the process logic in the WhsControlQty class – it manages the logic for entering quantity values from the mobile app:

public boolean process()
    {
        Qty qty = WHSWorkExecuteDisplay::str2numDisplay(data);
        if (qty <= 0)
        {
            return this.fail("@WAX1172");
        }

        if (mode == WHSWorkExecuteMode::Movement && WHSRFMenuItemTable::find(pass.lookup(#MenuItem)).RFDisplayStatus)
        {
            controlData.parmFromInventStatusId(controlData.parmInventoryStatusSelectedOnControl());
        }
        else
        {
            controlData.parmFromInventStatusId(controlData.getInventStatusId());
        }

        if (!super())
        {
            return false;
        }

        if (mode == WHSWorkExecuteMode::Movement && fieldValues.exists(#Qty))
        {
            pass.parmQty(qty ? data : '');
        }
        else
        {
            fieldValues.parmQty(qty ? data : '');
        }

        //When 'Display inventory status' flag is unchecked, need the logic for #FromInventoryStatus and #InventoryStatusId
        this.populateDataForMovementByTemplate();

        return true;
    }

The buildGetWeight method is very similar to the previous UI method – the only real difference is the Weight input data field.  Note that we don’t need to define a custom WHSField class for this field because it already exists in the July Release.
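For reference, a buildGetWeight counterpart could look roughly like the sketch below. It mirrors buildGetContainerId; the label texts and the extendedTypeNum(Weight) reference are assumptions on my part, so adjust them to the labels and the EDT that apply in your environment.

container buildGetWeight(container _con)
{
    container ret = _con;

    // Same pattern as buildGetContainerId; only the input field differs.
    ret += [this.buildControl(#RFLabel, #Scan, 'Enter the container weight', 1, '', #WHSRFUndefinedDataType, '', 0)];
    ret += [this.buildControl(#RFText, conWHSControls::Weight, 'Weight', 1, pass.lookupStr(conWHSControls::Weight), extendedTypeNum(Weight), '', 0)];
    ret += [this.buildControl(#RFButton, #RFOK, "@SYS5473", 1, '', #WHSRFUndefinedDataType, '', 1)];
    ret += [this.buildControl(#RFButton, #RFCancel, "@SYS50163", 1, '', #WHSRFUndefinedDataType, '', 0)];

    return ret;
}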

Error Display

There was another minor change that was necessary before I could get the expected behavior, and it points to a slight change in the framework itself.  In the previous version of the code when I reported that the weight was successfully saved I did so with an “addErrorLabel” call and passed in the WHSRFColorText::Error parameter to display the message at the top of the screen.  This same code in the new warehousing app will now cause the previous step to be repeated, meaning I will not get the state machine transition I expect.  Instead I need to use the WHSRFColorText::Success parameter to indicate that I want to display a status message but it should not be construed as an error condition.

container processWeightStep(container _ret, container _con)
…
ttsBegin;
containerTable = WHSContainerTable::findByContainerId(pass.lookupStr(conWHSControls::ContainerId), true);
if (containerTable)
{
    // Persist the entered weight on the container record.
    containerTable.Weight = pass.lookupNum(conWHSControls::Weight);
    containerTable.update();
    _ret = conNull();
    // Success (not Error) shows the status message without repeating the previous step.
    _ret = this.addErrorLabel(_ret, 'Weight saved', WHSRFColorText::Success);
    pass.remove(conWHSControls::ContainerId);
    _ret = this.getContainerStep(_ret);
}
else
{
    _ret = conNull();
    _ret = this.addErrorLabel(_ret, 'Invalid ContainerId', WHSRFColorText::Error);
    pass.remove(conWHSControls::ContainerId);
    _ret = this.getContainerStep(_ret);
}
ttsCommit;

 

Caching

The mobile app as well as the AOS perform a significant amount of caching, which can sometimes make it difficult to add new classes into the framework.  This is because the WHS code heavily leverages the SysExtension framework.  I find that having a runnable class included in the project that simply calls the SysExtensionCache::clearAllScopes() method can help resolve some of these issues.
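A minimal version of such a runnable class could look like this (the class name is arbitrary):

class conClearSysExtensionCache
{
    // Clears all SysExtension framework caches so that newly added extension
    // classes, such as the WHS control and field classes above, are picked up.
    public static void main(Args _args)
    {
        SysExtensionCache::clearAllScopes();
        info('SysExtension caches cleared.');
    }
}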

Conclusion

At this point I have a fully functional custom workflow that will display the new fields correctly in the mobile app.  You can see the container input field and weight input below.  Note that if you want to have the weight field display the “scanning” interface you can change the “preferred input mode” for the Weight EDT on the “Warehouse app field names” screen within the Dynamics 365 environment itself.

 

The Dynamics 365 for Operations project for this can be downloaded here.  This code is provided “as-is” and is meant only as a teaching sample and not meant to be used in production environments.  Note that the extension capabilities described in this blog are only available in the July 2017 release of Dynamics 365 for Finance and Operations or later.

Check out the new simplified way to configure the Cost Accounting module


In the latest version of Dynamics 365 for Finance and Operations, Enterprise Edition, we put a lot of effort into making it easier to create the initial configuration of the Cost accounting module.

Take a look at how simple it is!


What’s new in CU13 for WMS and TMS


Cumulative update 13 for Microsoft Dynamics AX 2012 R3 is now available for download on Lifecycle Services, PartnerSource, and CustomerSource. In this blog post, we will give you an overview of the feature improvements for warehouse and transportation management. If you want more details about the release of CU13, see the Dynamics AX In-Market Engineering blog. The knowledge base article for Cumulative Update 13 is KB4032175.

 

  • Packed goods can now be brought from the packing station to the staging area or loaded directly on a truck.
  • Ability to move one item or the entire license plate (LP) even if there is replenishment work behind it.
  • Transfer order receiving and returns are now enabled as part of Mixed pallet receiving.
  • Guided partial cycle counting at a location has been enabled.
  • Short picking - Inventory reallocation. Ability to pick items from another location in short picking scenarios.
  • Use the demand replenishment method in the raw material picking process.
  • Correct reservation status after re-marking of WHS items with the Storage dimension enabled.
  • Timing of planned production orders when overlap jobs are enabled and the route operations use different calendars.
  • Product confirmation requested by the system before a put is completed.
  • Ability to report consumption of staged and order picked material.
  • Plan purchase orders through TMS in an inbound process when using Change management.
  • Pick (other) work order lines even if one of the order lines is blocked by demand replenishment.
  • RAF\Transfer Order integration. Finished goods can be cross docked to bay door locations providing an alternative to the put-away process where finished goods from the production output would normally be put in the Finished goods put location.
  • With a maximum weight or volume on the work template, the work split is now based on the directive unit and not on the lowest unit of measure.

 

List of feature enhancements Description of enhancements
Packed goods can now be brought from the packing station to the staging area or loaded directly on a truck. The packing station experience has been improved to ensure that it will seamlessly integrate with the rest of the workflows in the warehouse.

This is a backporting of an enhancement added to a later version of the product.

 

Ability to move one item or the entire license plate (LP) even if there is replenishment work behind it If the Allow movement of inventory with associated work check box on the Work users page is enabled, you can now move part of or an entire license plate that is tied to replenishment work.
Transfer order receiving and returns are now enabled as part of Mixed pallet receiving With this enhancement, it is also possible to scan the product and then update the quantity manually.

The traditional way of using the system is still supported.

Guided partial cycle counting at a location has been enabled. The changes in the hotfix include adding support to do partial cycle counting. Work line breaks are added to the cycle counting work template and partial cycle counting work will be generated during cycle counting planning.

This is a backporting of an enhancement added to a later version of the product.

 

Short picking - Inventory reallocation. Ability to pick items from another location in short picking scenarios. Items in short picking scenarios can now be picked from another location which enables a process where goods can be shipped as fast as possible. Note that this feature requires kernel 6.3.1000.1928 (KB3048540) or higher.

For more information, see Set up short picking item reallocation.

Use the demand replenishment method in the raw material picking process. A wave template can now be constructed for raw material picking and a demand replenishment method can be added for production orders and kanbans. The capabilities correspond to the capabilities for sales order picking and transfer order picking.
Correct reservation status after re-marking of WHS items with the Storage dimension enabled. Warehouse items with the Storage inventory dimension enabled now get the correct reservation status after re-marking.
Timing of planned production orders when overlap jobs are enabled and the route operations use different calendars Planned production orders were proposed too early when overlapping jobs were enabled and the route operations used different calendars. This issue has now been resolved so that, in this situation, planned production orders are proposed at the correct time.
Product confirmation requested by the system before a put is completed. For cluster picking, piece-by-piece picking can now be enabled to have the system request a confirmation before a put is completed.

 

For more information, see: Product confirmation for cluster picking.

Ability to report consumption of staged and order picked material. It is now possible to report consumption of material that is either reserved or picked.
Plan purchase orders through TMS in an inbound process when using Change management. Purchase orders in an inbound process can now be updated from Transportation management when Change management is activated. Previously, this would require that the entire purchase order would have to be routed through the approval steps once again.
Pick (other) work order lines even if one of the order lines is blocked by demand replenishment. The replenishment work blocking policy can now be set up to allow that users can pick (other) work order lines even if one of them is blocked by demand replenishment.

This is a backporting of an enhancement added to a later version of the product.

RAF\Transfer Order integration. Finished goods can be cross docked to bay door locations providing an alternative to the put-away process where finished goods from the production output would normally be put in the Finished goods put location. When reporting as finished, goods can now be cross docked to a bay door location based on a transfer order demand.

A transfer order that is released to warehouse will generate work of the type Transfer issue and this work can then be picked from the production output location and put to a location that is determined by the Transfer issue location directive for work.

With a maximum weight or volume on the work template, the work split is now based on the directive unit and not on the lowest unit of measure. Previously, when you set up a maximum weight or volume on the work template, work would split on the lowest unit of measure (UOM). Now work splits on the directive unit.

 

 

Report the value of physical locations


Key concepts

 

Inventory dimension

An inventory dimension can have a Physical value and a Financial value. The setting of the Physical value controls whether a dimension is active for Inventory management and Warehouse management. The setting of the Financial value controls whether a dimension is active for Inventory accounting in Cost management.

Note: An inventory dimension can have an active Physical value but no Financial value. However, it can’t have an active Financial value but no Physical value.

Cost object

The term cost object was introduced in Microsoft Dynamics 365 for Finance and Operations, Enterprise edition. It represents a key concept that is used in the management of business costs. A cost object is an entity that costs and quantities are accumulated for. For example, a cost object entity can be either a product or product variants, such as variants for style and color.

A cost object is derived from the item ID and the inventory dimensions that have an active Financial value.

There are three groups of inventory dimensions: product, storage, and tracking. Each inventory dimension group has a set of dimensions associated with it. For each dimension, you can set up the following inventory dimension values.

Product dimension group Storage dimension group Tracking dimension group
Dimension Value Dimension Value Dimension Value
Color Financial by default Site Financial by default Batch Physical and/or Financial
Size Financial by default Warehouse Physical and/or Financial Serial Physical and/or Financial
Configuration Financial by default Location Physical Owner Financial by default
Style Financial by default Inventory status Physical
  License plate Physical

Example: Define a cost object

When an item is created, a set of inventory dimension groups can be assigned to it. The following table shows how you can define a cost object.

Product dimension group Storage dimension group Tracking dimension group
Color Financial by default Site Financial by default
  Warehouse Physical

 

In the following example, the cost objects are defined by Item + Color + Site.

Examples:

  • Speaker + Black + Site 1
  • Speaker + White + Site 1
  • Speaker + Black + Site 2

The inventory objects can be used to report the physical quantity at any level in the warehouse that is defined by Item + Color + Site + Warehouse.

Examples:

  • Speaker + Black + Site 1 + WH 11
  • Speaker + Black + Site 1 + WH 12
  • Speaker + White + Site 1 + WH12
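Another way to look at the example: the cost object key is the item ID plus the values of the financially active dimensions, and the inventory object key adds the dimensions that are only physically active. The sketch below is purely illustrative; the list-based parameters are an assumption and not how the system represents dimensions.

class ConCostObjectKeySketch
{
    // Sketch only: builds display keys for the cost object and the inventory object
    // of an item. _financialDimValues holds the values of dimensions with an active
    // Financial value; _physicalOnlyDimValues holds those that are Physical only.
    public static void showKeys(ItemId _itemId, List _financialDimValues, List _physicalOnlyDimValues)
    {
        str costObjectKey = _itemId;
        str inventoryObjectKey;
        str dimValue;
        ListEnumerator values = _financialDimValues.getEnumerator();

        while (values.moveNext())
        {
            dimValue = values.current();
            costObjectKey += ' + ' + dimValue;
        }

        // The inventory object adds the physically active dimensions on top of the cost object.
        inventoryObjectKey = costObjectKey;
        values = _physicalOnlyDimValues.getEnumerator();

        while (values.moveNext())
        {
            dimValue = values.current();
            inventoryObjectKey += ' + ' + dimValue;
        }

        info(strFmt('Cost object: %1', costObjectKey));           // e.g. Speaker + Black + Site 1
        info(strFmt('Inventory object: %1', inventoryObjectKey)); // e.g. Speaker + Black + Site 1 + WH 11
    }
}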

It’s crucial that you understand the concepts. The configuration and implementation of these concepts have a significant impact on the whole system, especially in Inventory management and Cost management.

After the configuration is implemented, it’s almost irreversible. Any change will require significant resources and will affect system usage itself.

In the rest of the document, we will use the Speaker item as an example. The inventory valuation method is set to Moving average.

Cost object:

  • Speaker + Black + Site 1

Inventory objects:

  • Speaker + Black + Site 1 + WH 11
  • Speaker + Black + Site 1 + WH 12

After a few transactions have been posted, the following inventory transaction entries are generated in the Inventory subledger.

Color Site Warehouse Financial date Reference Status Quantity Cost amount
Black Site 1 WH11 1/1/2017 Purchase order 01 Purchased 1.00 10.00
Black Site 1 WH12 2/1/2017 Purchase order 02 Purchased 2.00 26.00
Black Site 1 WH11 3/1/2017 Sales order 01 Sold -1.00 -12.00

 

The inventory close job was run as of January 31, 2017. Because the inventory valuation method was Moving average, no adjustments were posted.

As part of the fiscal period–end process, an Inventory value report that shows the ending inventory balance in quantity and value is required. To meet this requirement, the inventory value report framework was introduced. The framework lets you create custom reports by including more data points that depend on the type of business. It also lets you define the level of aggregation for cost objects.

Note: The inventory value report is designed to print only the values per cost object or aggregations of cost objects.

You create an Inventory by cost object report, based on the configuration in the following table.

FastTab group Field group Field Setup value Setup value
General Range Posting date
Columns Financial position Inventory Yes
  Resource ID View Yes
  Inventory dimensions Color View (Column) Yes
  Inventory dimensions Site View (Column) Yes
  Inventory dimensions Warehouse View (Column) Yes
  Average unit cost Calculate average unit cost Yes
Rows Resource type Materials Yes
  Detail level Level Totals

 

The report will look like this.

Resource Color Site Warehouse Quantity Value Avg. unit cost
Speaker Black 1 2.00 24.00 12.00

 

Note: The Warehouse column remains blank, because the Speaker item doesn’t have any cost object that includes the Warehouse dimension. The Warehouse inventory dimension is only set to Physical.

View the inventory value by physical location Warehouse

Configure the storage dimension group

To meet the customer’s request, you could configure the storage dimension group differently. In this case, the Warehouse dimension is configured so that it has a Financial value.

Product dimension group Storage dimension group Tracking dimension group
Color Default Financial Site Default Financial
Warehouse Financial

 

This configuration affects how the Speaker item is handled by the system. The cost object and inventory object now have the same level of granularity.

Cost objects:

  • Speaker + Black + Site 1 + WH 11
  • Speaker + Black + Site 1 + WH 12

Inventory objects:

  • Speaker + Black + Site 1 + WH 11
  • Speaker + Black + Site 1 + WH 12

The configuration also directly affects the inventory valuation. In this example, the FIFO, Weighted average, or Moving average inventory valuation method will be applied per cost object, and the overall result will differ.

Color Site Warehouse Financial date Reference Status Quantity Cost amount
Black Site 1 WH11 1/1/2017 Purchase order 01 Purchased 1.00 10.00
Black Site 1 WH11 3/1/2017 Sales order 01 Sold -1.00 -10.00

 

Color Site Warehouse Financial date Reference Status Quantity Cost amount
Black Site 1 WH12 2/1/2017 Purchase order 02 Purchased 2.00 26.00

 

The result will also differ when the Inventory value report is run by using the same configuration that is described in the previous section.

Resource Color Site Warehouse Quantity Value Avg. unit cost
Speaker Black 1 WH12 2.00 26.00 13.00

 

The Warehouse column now has a value, and the inventory value is 26.00 instead of 24.00.

Note: When you activate the Financial value for the Warehouse inventory dimension, you might affect performance. All transfers between warehouses are now considered financial movements, and financial movements can cause cycles in the Inventory close job. If the Warehouse inventory dimension is used only to physically track inventory, these transfers are closed as non-financial transfers before the cost calculation begins, which reduces the chance of cycles.

Create a custom report that looks at inventory transactions and settlements.

You can create a custom report that sums inventory transactions and settlements by InventDim ID.

The old Physical inventory by inventory dimension report was designed for that purpose. In the report dialog box, users could select any inventory dimensions, regardless of whether they were part of the defined cost objects.

This approach works, provided that the inventory dimensions that you select are part of the defined cost object. However, if you select an inventory dimension that isn’t part of the cost object, the report starts to print incorrect results.

The following table shows the result of printing balances per item and inventory dimensions.

Resource Color Site Warehouse Quantity Value Avg. unit cost
Speaker Black 1 WH11 0.00 -2.00 0.00
Speaker Black 1 WH12 2.00 26.00 13.00

 

Note: The inventory cost is calculated at a level above the Warehouse inventory dimension. Therefore, the cost on the issue transaction from warehouse WH11 explains why the inventory value per warehouse can become negative.

Report the value of physical locations

If the value of a physical location must be reported, a sum of transactions per location, as described in the previous section, isn’t the solution.

The correct approach is to calculate the value per location by using the following simple formula:

Value = Cost object cost × Physical object quantity

Cost object:

  • Speaker + Black + Site 1
Resource Color Site Quantity Value Avg. unit cost
Speaker Black 1 2.00 24.00 12.00

 

Inventory objects:

  • Speaker + Black + Site 1 + WH 11
  • Speaker + Black + Site 1 + WH 12
Resource Color Site Warehouse Quantity Formula Value
Speaker Black 1 WH11 0.00 12.00 × 0.00 0.00
Speaker Black 1 WH12 2.00 12.00 × 2.00 24.00
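Expressed in code, this is simply the cost object's average unit cost multiplied by the on-hand quantity of each inventory object. The sketch below is illustrative only and does not use the actual report classes; the parameter names and simplified types are assumptions.

class ConLocationValueSketch
{
    // Sketch only: values each warehouse by multiplying the average unit cost of
    // the owning cost object by the on-hand quantity of each inventory object.
    public static Map valueByWarehouse(real _costObjectValue, Qty _costObjectQty, Map _qtyByWarehouse)
    {
        Map  result      = new Map(Types::String, Types::Real);
        real avgUnitCost = _costObjectQty ? _costObjectValue / _costObjectQty : 0;
        MapEnumerator warehouses = _qtyByWarehouse.getEnumerator();

        while (warehouses.moveNext())
        {
            // e.g. WH11: 12.00 x 0.00 = 0.00 and WH12: 12.00 x 2.00 = 24.00
            result.insert(warehouses.currentKey(), avgUnitCost * warehouses.currentValue());
        }

        return result;
    }
}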

 

In Microsoft Dynamics AX 2012 R3, a new report named Inventory aging was introduced. As the name implies, this report does more than report the value by physical location: it also shows the age of the current inventory in user-defined buckets.

In the report dialog box, enter the following information.

Field group Field Setup value
As of date 31-01-2017
View Item number View
  Color View
  Site View
  Warehouse View
Aging period Unit of aging period Days
  Aging period 1 30
  Aging period 2 60
  Aging period 3 90
  Aging period 4 120

 

The following table shows only the first section of the Inventory aging report; the user-defined buckets have been omitted.

Resource Color Site Warehouse On-hand Quantity On-hand Value Inventory value quantity Inventory value Avg. unit cost
Speaker Black 1 WH11 0.00 0.00 2.00 24.00 12.00
Speaker Black 1 WH12 2.00 24.00 2.00 24.00 12.00

 

Conclusion

If your organization must provide inventory value by physical location, you don't have to update the current configuration of the inventory dimension groups. Such a change can be very intrusive, and it also affects the cost calculation and performance. We also don't recommend that you build a custom report for this purpose.

The Inventory aging report is designed to calculate the cost per cost object and then apply it to any physical level that is selected on the report. The report automatically detects the level at which the cost object is defined for each item and then applies the formula to calculate the value by physical location.

 

Negative inventory in inventory accounting


Allowing physical negative inventory may have undesirable consequences in inventory accounting, especially if the inventory costing principle is Actual and the valuation method is either FIFO or Weighted average.

Most of the issues that are related to physical negative inventory can be mitigated by using the correct configuration and maintenance of data.

Example: Why is the cost out of sync?

The following table lists the required setup for the Item model groups.

Item Inventory model Physical negative inventory Latest cost price Active planned cost
A FIFO Yes Yes No

 

The purchase from the supplier always takes place at a unit cost of 7,500.00.

The following table lists the events as they occur, in chronological order.

Financial date Reference Receipt Issue Quantity Cost amount
10/6/2017 Sales order 01 Sold -3.00
10/6/2017 Purchase order 01 Purchased 2.00 15,000.00
10/6/2017 Sales order 02 Sold -3.00 -22,500.00
10/6/2017 Purchase order 02 Purchased 1.00 7,500.00
10/6/2017 Sales order 03 Sold -3.00 -22,500.00
10/6/2017 Purchase order 03 Purchased 3.00 22,500.00
10/6/2017 Purchase order 04 Purchased 2.00 15,000.00
10/6/2017 Purchase order 05 Purchased 3.00 22,500.00
10/6/2017 Sales order 04 Sold -1.00 -18,750.00

 

The system starts issuing from inventory at a cost per unit of 18,750.00 even though the purchase cost has never exceeded 7,500.00.

Why does this happen?

To explain this in more detail, let's add a few more columns on the right. These new columns represent the inventory balance after each transaction is posted. The inventory balance is also known as InventSum.

Financial date Reference Receipt Issue Quantity Cost amount Quantity Value Avg. unit cost
10/6/2017 Sales order 01 Sold -3.00 1) -3.00 $0.00 $0.00
10/6/2017 Purchase order 01 Purchased 2.00 15,000.00 -1.00 $15,000.00 -$15,000.00
10/6/2017 Sales order 02 Sold -3.00 -22,500.00 2) -4.00 -$7,500.00 $1,875.00
10/6/2017 Purchase order 02 Purchased 1.00 7,500.00 -3.00 $0.00 $0.00
10/6/2017 Sales order 03 Sold -3.00 -22,500.00 3) -6.00 -$22,500.00 $3,750.00
10/6/2017 Purchase order 03 Purchased 3.00 22,500.00 -3.00 $0.00 $0.00
10/6/2017 Purchase order 04 Purchased 2.00 15,000.00 -1.00 $15,000.00 -$15,000.00
10/6/2017 Purchase order 05 Purchased 3.00 22,500.00 2.00 $37,500.00 $18,750.00
10/6/2017 Sales order 04 Sold -1.00 -18,750.00 4) 1.00 $18,750.00 $18,750.00

 

Notes:

  1. The Cost per unit is 0.00. When the balance is negative, the system looks for a fallback cost to apply.
    1. First, the system looks for an active cost. This fails.
    2. Second, the system looks for the cost set up in the Cost price field on the item master record. This cost is equal to 0.00.
  2. The Cost per unit is 7,500.00. The balance is still negative. The system looks for a fallback cost to apply.
    1. The system looks for an active cost. This succeeds.
    2. The latest cost price was set to Yes on the item record. The prior Purchase order unit cost of 7,500.00 has now become the active cost.
  3. The same condition applies as in number 2.
  4. The Cost per unit is 18,750.00. If the balance is positive at the time of posting the issue transaction, the system applies the average cost of the balance.
    1.  Average cost is calculated as: 37,500.00 / 2.00 = 18,750.00
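The notes above boil down to a short decision sequence. The sketch below only illustrates that sequence (the class and parameter names are assumptions); the actual costing logic is considerably more involved.

class ConNegativeInventoryCostSketch
{
    // Sketch only: estimates the unit cost applied to an issue transaction,
    // following the fallback sequence described in the notes above.
    public static real estimatedIssueUnitCost(
        Qty  _balanceQty,     // on-hand quantity before the issue
        real _balanceValue,   // on-hand value before the issue
        real _activeCost,     // active cost, 0 if no active cost exists
        real _itemCostPrice)  // Cost price field on the item master record
    {
        if (_balanceQty > 0)
        {
            // Positive balance: use the running average of the balance,
            // for example 37,500.00 / 2.00 = 18,750.00.
            return _balanceValue / _balanceQty;
        }

        // Negative or zero balance: fall back to the active cost first and then
        // to the Cost price on the item record, which may be 0.00.
        return _activeCost ? _activeCost : _itemCostPrice;
    }
}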

The issue is that the inventory balance of 1 piece at 18,750.00 is overvalued because the item has never been purchased at a cost higher than 7,500.00. The reason for this overvaluation is that the first issue transaction leaves inventory at a cost per unit of 0.00. The wrong cost estimation will continue to ripple through the following transactions.

The only way to get the inventory balance back in sync is to run either the Recalculation or Inventory close jobs.

Financial date Reference Receipt Issue Quantity Cost amount Qty Value Avg. unit cost
10/6/2017 Sales order 01 Sold -3.00 -3.00 $0.00 $0.00
10/6/2017 Purchase order 01 Purchased 2.00 15,000.00 -1.00 $15,000.00 -$15,000.00
10/6/2017 Sales order 02 Sold -3.00 -22,500.00 -4.00 -$7,500.00 $1,875.00
10/6/2017 Purchase order 02 Purchased 1.00 7,500.00 -3.00 $0.00 $0.00
10/6/2017 Sales order 03 Sold -3.00 -22,500.00 -6.00 -$22,500.00 $3,750.00
10/6/2017 Purchase order 03 Purchased 3.00 22,500.00 -3.00 $0.00 $0.00
10/6/2017 Purchase order 04 Purchased 2.00 15,000.00 -1.00 $15,000.00 -$15,000.00
10/6/2017 Purchase order 05 Purchased 3.00 22,500.00 2.00 $37,500.00 $18,750.00
10/6/2017 Sales order 04 Sold -1.00 -18,750.00 1.00 $18,750.00 $18,750.00
30/6/2017 Sales order 01 -22,500.00
30/6/2017 Sales order 04 11,250.00 1.00 7,500.00

 

The Inventory close will apply the selected inventory model, which in this case is FIFO, and then adjust the cost on the issue transactions accordingly.

Note: If the inventory balance is negative when executing the inventory close, the balance will not be adjusted. A full adjustment can only occur when the balance is positive, and enough receipts exist that can adjust the issues.

Conclusion

By default, all items should have an active cost. If you plan to allow temporary physical negative inventory, which is a valid scenario in certain businesses, it's essential to apply an active cost before creating any transactions for the item.

WMSI/WMS2 item migration


Introduction

This blog describes the new capabilities that allow you to migrate existing items with open inventory transactions so they can use the new storage dimensions. This can be needed when you upgrade from older versions of Microsoft Dynamics 365 for Finance and Operations that supported the pallet dimension or if you want to use the Warehouse management functionality for existing items that were previously using WMS1 processes.

Typical migration scenarios:

  • You have existing items with the location dimension enabled.
  • You have existing items with the location dimension enabled and the pallet dimension active.
  • You have existing items with the location dimension enabled that are catch weight items.

The goal when migrating items to use warehouse management enabled processes

The following setup is required to use an item in warehouse management processes:

  1. The item must use a storage dimension group that is set up to use warehouse management processes, which means that the Inventory status, Location, and License plate dimensions must be active.
  2. A reservation hierarchy must be assigned.
  3. A unit sequence group must be assigned.

The goal of the migration is to enable the items to meet the above criteria and to ensure that all related data is consistent with the items’ new settings, allowing business processes to proceed after the migration.

 

High-level overview of the process

The upgrade process – blocking items

If you upgrade from a version that supported pallets, the upgrade will identify the items that had the pallet dimension active and create a record in a new table called InventUpdateBlockedItem. This blocks the items from all inventory processes, because they are configured using unsupported settings.

Items that are blocked must be migrated. The blocked items can be viewed in the Items blocked for inventory updates form.

The migration cockpit – Change storage dimension group for items

The migration cockpit is simple. It allows you to enter the item IDs of the items you want to migrate and the new storage dimension group. If the item is to be downgraded from a group with the pallet dimension active, this is all you need.

If the item should be converted to use warehouse management enabled processes, you need to assign a reservation hierarchy and a unit sequence group.

The illustration below is a screenshot of the form.

Using entities to populate the data

You can use both OData, which allows you to use Excel directly, and data management to import and export the data.

OData

You can use the OData approach by clicking the Open in Microsoft Office icon and exporting the data to Excel. Once the information has been entered, the changes can be synchronized back.

Data management

For larger datasets, data management is the most effective approach. The entity is called Item storage dimension group change request and follows normal data management patterns.

 

Validation

Before the migration is started, validation should be performed to ensure that the data is ready for migration. Several conditions are validated:

  • A default inventory status value must be defined
  • Inventory on-hand on pallets must not exist on non-license-plate-tracked locations
  • Inventory on-hand without pallets must not exist on license-plate-tracked locations
  • The combination of the selected storage dimension group and reservation hierarchy must be valid
  • If migrating to use warehouse management processes, the item cannot be catch weight enabled

 

The list is not exhaustive, but it covers the most important validation points.

Starting the migration

The migration is started by clicking the Process changes button.

The migration supports parallel processing for parts of the process. The batch framework is used and the different steps will be handled by different batch tasks.

The recommendation is to set the Recommended number of parallel tasks field to two times the number of cores that are available. It is recommended to have a dedicated batch server for the migration since the migration is a heavy process due to the updates to the inventory dimensions.

Unsupported scenarios

Catch weight enabled items: Catch weight enabled items are not supported in the new WHS. If such items exist and they have the pallet dimension active, they will need to be downgraded to a dimension group where only the location is active. Otherwise, an ISV solution should be used.

Reserve Ordered transactions: Because of the way reservations work for warehouse management enabled items, there are certain constraints on the state of the inventory transactions. If there are Reserved ordered transactions with a “hole” in the dimensions, the item cannot be converted.

A “hole” could exist for an item that has the batch dimension active and has the batch placed below the location in the reservation hierarchy. If a reservation exists on site, warehouse, batch, then the location is missing, which is what we refer to as a “hole”. This is not supported.

The mitigation is to either assign the missing dimensions or clear the dimensions so the “hole” is removed.

Customizations that involve inventory dimensions

If you have customizations related to inventory dimensions that fall into one of these two categories:

  • A new inventory dimension field on an existing table
  • A new table with an inventory dimension field

you need to handle them before you can use the migration process. Otherwise you will get an error like the one below:

Since a new inventory dimension field exists, the system needs to know what actions should be taken for the dimensions in the table. This is done by implementing an event handler.

There are three situations where an inventory dimension field is added:

  1. You do not want the data updated
  2. You want the data updated and the table has the itemId on it
  3. You want the data updated but the table does not have the itemId on it

 

In the examples below, we will assume that we have added a table looking like this:
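Based on the code below, the table – called MyOwnTableWithInventDimId in these examples – carries an ItemId field, a DataAreaId field, and two inventory dimension reference fields: InventDimIdAllDimensions and InventDimIdOnlyProductDimensions.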

Handling situations 1 and 2

The event handler below illustrates how situations 1 and 2 can be implemented. Here we want to update the dimensions related to the InventDimIdAllDimensions field, but not the dimensions in InventDimIdOnlyProductDimensions.

[SubscribesTo(classStr(InventItemInventoryDimensionConversionTaskInitiator), delegateStr(InventItemInventoryDimensionConversionTaskInitiator, tableWithInventDimIdDiscoveredDelegate))]
public static void tableWithInventDimIdDiscoveredStorageConversionDelegateHandler(
    TableId _updateTableTableId,
    FieldId _inventDimIdFieldId,
    InventItemInventoryDimensionConversionType _conversionType,
    EventHandlerResult _result)
{
    if (_updateTableTableId == tableNum(MyOwnTableWithInventDimId))
    {
        if (_inventDimIdFieldId == fieldNum(MyOwnTableWithInventDimId, InventDimIdAllDimensions))
        {
            // Situation 2: the dimensions must be updated, and the table carries the item ID itself
            InventItemInventoryDimensionConversionTaskCreator creator = InventItemInventoryDimensionConversionTaskCreator::newStorageConversion();
            creator.createTasksForTableWithItemId(
                _updateTableTableId,                               // tableId of the table that should be updated
                _inventDimIdFieldId,                               // fieldId of the inventDimId field that should be updated
                fieldNum(MyOwnTableWithInventDimId, ItemId),       // fieldId of the ItemId field
                fieldNum(MyOwnTableWithInventDimId, DataAreaId));  // fieldId of the DataAreaId field
            _result.booleanResult(true); // the field must be updated
        }
        else if (_inventDimIdFieldId == fieldNum(MyOwnTableWithInventDimId, InventDimIdOnlyProductDimensions))
        {
            // Situation 1: no update is needed for this field
            _result.booleanResult(false);
        }
    }
}

Handling situation 3

Situation 3 is different because the itemId is on a different table. Here we need to write code specific to the data model of the involved tables. The best approach is to follow the existing examples in the code; the InventBatchJournalResult table is a good one. That table is more complex because it has an InventDimId, but the itemId is on the InventBatchJournal table. The code below shows how this scenario is handled.

public class InventItemInventDimConversionInventBatchJournalLinePopulationTaskProcessor implements InventItemInventoryDimensionConversionITaskProcessor
{
    public boolean process(InventItemInventoryDimensionConversionTask _conversionTask)
    {
        // Build a query that joins the table holding the InventDimId with the
        // InventBatchJournal table, which is where the ItemId lives.
        var queryBuilder = InventItemInventoryDimensionChangePopulatorItemIdTableJoinedQueryBuilder::newFromParameters(
            _conversionTask.UpdateTableName,
            _conversionTask.InventDimIdFieldId,
            _conversionTask.DataAreaIdFieldId,
            tableStr(InventBatchJournal),
            fieldNum(InventBatchJournal, ItemId));

        // Populate the dimension change records based on the joined query.
        InventItemInventoryDimensionChangePopulator::newFromQueryBuilder(queryBuilder).populateDimensionChanges();

        return true;
    }
}

Additional information

You can find more information as part of the product documentation:

https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/warehousing/upgrade-migration-warehouse-management-processes

Customizing the warehouse mobile app: multi-scan pages


Introduction

This is another blog post in the series about warehouse mobile devices in Dynamics 365 for Finance and Operations. In the last blog post, we discussed the difference between customizing for WMDP and for the warehouse mobile app. This post walks you through a recently released control scheme called “multi-scan” and explains how it unlocks new potential for partial offline processing in the warehouse mobile app. Multi-scan enables a user to perform a series of offline scanning operations and then return them all to the server in a single round trip. The goal is to support very quick scanning (several scans per second, for example) in high-volume warehouses where the standard model of a server round trip after each scan will not scale – especially in sequential operations where the user does not need to verify anything on the device after each scan, but simply needs to register all scans in one go.

Multi-scan Functionality

If you download the latest version of the warehouse mobile app, you will find some new capabilities in demo mode that show how this new control pattern works. Once you have enabled demo mode, you should see the following menu:

Cycle counting is the flow we have enabled with multi-scanning in the demo to demonstrate the new capabilities. It is designed to simulate a user performing a spot cycle count at a location in a warehouse or retail store where there are many items to scan. Currently this is only available in the demo mode of the app; there is no support for this functionality when connected to a Dynamics 365 for Finance and Operations environment.

The first screen that is displayed is a location scanning screen – you can enter (or scan) anything in the demo mode here to move to the next screen.

Once you have scanned the location, the app enters the multi-scanning mode.  This is the new control that is being introduced in this release, so let’s go through the different UI elements that have been introduced to support this new flow.

This is the initial screen – you can tell it is the multi-scanning interface because of the new list icon in the bottom left corner; clicking the list icon will show you the list of items you have scanned so far. The checkbox icon in the bottom right is used to report to the app that you are done scanning and it is the only time the processing returns to Dynamics 365 – everything else will take place within the app locally on the device.

Once a worker starts to scan barcodes (or enter data manually into the app), the UI will change slightly. Every item scanned is added to an internal buffer, and the number of items scanned is displayed in the main UI. For example, after a few scans the UI will display a scanned count of three:

At any time, the user can click the list icon in the lower left, which will then display the list of items that have been scanned (in this example perhaps product barcodes in the location).  The UI for this looks like the following:

This lists the barcodes that have been scanned as well as a count of the times they have been scanned by the user.  This is very useful in the counting scenario, as a user can simply scan each product’s barcode to generate a count of items at that location.

You might note that there are two disabled buttons at the bottom of the screen.  These become active when a row is selected by the user in the list – as you can see below:

The edit icon on the left allows you to manually change the number of scans for the selected row. The icon on the right with the “X” deletes the selected row in case something was scanned accidentally.  

The edit icon will open a new screen with the numeric stepper UI allowing the user to quickly increment or decrement the number of scans or click on the value to open the numeric keyboard:

When clicking on the value, the numeric keyboard will open. Because the number of scans cannot be negative or fractional, the buttons that aren’t relevant for this use case have been disabled:

Returning to the main screen (by clicking the back button in the upper left corner), we are ready to submit the scanned list of items and their counts to the server. We do this by clicking the checkbox button in the bottom right – this is when we finally make the round trip to the server and communicate with Dynamics 365. Later in this blog post, the API will be explained, along with how to consume the scanned items and their quantities in X++ code.

In the demo flow, the next screen that is displayed is the following list of items which are not present in the location:

This is the second control pattern introduced as part of this release – it allows a workflow to display a list of items (for example barcodes or product UPCs) and then lets the user “scan to remove” entries from the list. In this demo example we are displaying the items that were found in the cycle count but are not currently registered as on-hand for this location; the intention is that the warehouse worker would double-check this list and scan any items that were indeed found, as an extra validation check. Scanning “T0001” in the above screen would remove it from the list – and remember that all of this is still done client-side. It is also possible to click on any value in the list to remove it. When the user clicks the checkbox/submit button, the new list of items is submitted to the server for processing through an X++ workflow.

Custom Workflow

Hopefully that walkthrough gives you some idea of the capabilities we have added with these two new client-side control screens. It is important to know that we have not yet added any multi-scan capabilities to the core product – the above cycle counting workflow is just a demo inside the app. The goal of introducing these new control screens is to enable partners and customers to build new workflow-based solutions in the mobile app that support client-side driven scanning operations. With that in mind, let’s walk through a simple customization example to show how the new control screens can be used in a real workflow.

Page Patterns

The way to enable the multi-scan screen is through a Page Pattern.  This might not be something you are aware of in the mobile app, as most of the time this is handled for you by the standard framework.  The Page Pattern is what tells the mobile app what type of UI to display on the device itself.  If you look at the WHSMobileAppPagePattern Enum you can see the different options available:

  • Default
    • This is the page pattern used for 90% of the screens in the app. It displays a primary scanning UI and a set of controls in the secondary tab – of which a few can be promoted to the first screen.  An enter and cancel button and an optional set of additional buttons in the menu are supported.
  • Custom
    • This Page Pattern is not used in many places in the core mobile flows – it is designed to allow partners to convert their old WMDP pages into the new model. Using this pattern renders the controls as they were in WMDP – each control simply stacked vertically on a single screen.
  • Login
    • This is used for the initial login page.
  • Menu
    • The Menu screens are rendered with this Page Pattern.
  • Inquiry
    • This Page Pattern supports the workflows that allow the user to search for something and then see the results – such as the LP or item lookup screens.
  • InquiryWithNavigation
    • This is the Page Pattern that supports the Worklist view in the app. It is similar to the Inquiry pattern, except that it includes some sorting options and the tiles are navigable.
  • MultiScan
    • This is the new pattern that has been added; it displays the multi-scan UI shown in the demo above.
  • MultiScanResult
    • Note that as of the 8.1.1 release this value is still missing and will be added in an upcoming release. If you want to enable a workflow to use the second screen described above – the “result list” of items – you would need to add a new enum value and return MultiScanResult.

The actual job of returning the Page Pattern to the app is done through a class that derives from WHSMobileAppServiceXMLDecorator. This abstract class has a requestedPattern method that can be overridden to return the specific Page Pattern that is needed. This is typically done through a workflow-specific factory class that understands the workflow steps and can therefore return the correct XMLDecorator class depending on the stage in the state machine.
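To illustrate the mechanism, here is a minimal sketch of a decorator that requests the multi-scan pattern by overriding requestedPattern. The class name is made up for illustration and the member signature is assumed; in the shipped code the standard WHSMobileAppServiceXMLDecoratorMultiScan class already fills this role.

// Minimal sketch only: MyMultiScanXMLDecorator is a hypothetical class name, and the
// base-class member signature is assumed rather than taken from the product code.
class MyMultiScanXMLDecorator extends WHSMobileAppServiceXMLDecorator
{
    // Ask the warehouse mobile app to render this page with the multi-scan UI.
    public WHSMobileAppPagePattern requestedPattern()
    {
        return WHSMobileAppPagePattern::MultiScan;
    }
}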

For example, here is the standard factory class for the Work List functionality. You can see that it typically returns the WHSMobileAppServiceXMLDecoratorWorkList object – which renders the work list Page Pattern as you would expect. However, if the user has switched to the edit filter view, a different Page Pattern needs to be displayed – and the factory has the context to make this switch.

Multi-Scan API

Now that we know how to enable the Multi-Scan UI through a Page Pattern, we need to understand the basic API for passing the scanned items back and forth. Once the MultiScan Page Pattern is requested, the first input control registered on the page is used for the multi-scan input. Remember that most of the UI interaction happens client-side – so the only thing the server-side X++ code needs to do is define this control and the data that it contains.

When the user clicks the “submit” check box and sends the multi-scan data back to the X++ code, the data is formatted in a very specific way. The actual parsing of the data is done using the same interaction patterns as before – it will be stored in the result pass object for the control defined as the primary input of the page. The data is passed in this format:

                         <scanned value>, <number of scans>|<scanned value>, <number of scans>|…

Thus, in my demo example above the data that the server would receive would be the following:

                         BC-001,2|BC-002,1|BC-003,1

In the X++ code you are then responsible for parsing this string and storing the data in the necessary constructs. We will see a simple example of how to parse this data in a moment.
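As a rough idea of what that parsing could look like, here is a minimal sketch that splits the documented “value,count|value,count” string. It is not taken from the demo project, and the processScannedValue helper is a hypothetical placeholder for whatever your workflow does with each entry.

// Minimal sketch, not from the demo project: parse the multi-scan result string
// "<scanned value>,<number of scans>|<scanned value>,<number of scans>|..."
private void parseMultiScanResult(str _multiScanValue)
{
    List           entries         = strSplit(_multiScanValue, '|');
    ListEnumerator entryEnumerator = entries.getEnumerator();

    while (entryEnumerator.moveNext())
    {
        List           parts          = strSplit(entryEnumerator.current(), ',');
        ListEnumerator partEnumerator = parts.getEnumerator();
        str            scannedValue;
        int            numberOfScans;

        if (partEnumerator.moveNext())
        {
            scannedValue = strLRTrim(partEnumerator.current());
        }
        if (partEnumerator.moveNext())
        {
            numberOfScans = str2Int(strLRTrim(partEnumerator.current()));
        }

        // Hypothetical helper: store or process each scanned value and its count.
        this.processScannedValue(scannedValue, numberOfScans);
    }
}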

Asset Scanning

The workflow I will demonstrate is very similar to some of the WHS workflow demos we have described in previous blog posts.  In this flow we will be scanning a container and then capturing all the “assets” that are stored in that container.  Imagine that these assets are very numerous and thus are stored outside of the standard batch tracking mechanisms in AX – they are implemented in the sample as a simple asset table associated with a container.  The assumption here is that scanning assets needs to be extremely quick and thus the offline “multi-scan” mode is the perfect solution.

The state machine for this workflow is similar to our previous examples – we will have an initial scan container screen, which will transition into the multi-scan enabled “Scan Assets” state – and finally when the list is returned we process the assets and return to the initial state. 

You can see this reflected in the core displayForm method below.  We will not be covering some of the lower-level details of the code – please review the earlier blog posts for details on the enums and control classes necessary to facilitate the new workflow.  All the code necessary for the solution can be downloaded at the end of the post if you want to dig into the details.

The getContainerStep method is identical to our previous examples – it shows a simple UI and grabs the container ID from the user. The getAssetStep method validates this container ID and calls the buildGetAssets method, which is where the UI for the multi-scan screen is built. This is copied below:

As you can see, this does not look much different than the standard WHS code we have written previously. The first input control (in this case the Asset ID field) will be used as the multi-scan field, but this code does not need to be modified in any way to support the multi-scan Page Pattern. Instead, what we need to do is ensure that the correct Page Pattern is returned to the app during the correct workflow step. To make this happen, I have added a new DecoratorFactory class that returns a WHSMobileAppServiceXMLDecoratorMultiScan object at the appropriate step of my workflow – which in turn is what renders the Page Pattern in the app.

Please note the attribute at the top of this class – it is the same WHSWorkExecuteMode mapping attribute used for the WHSWorkExecuteDisplayAssetScan class in the code sample above.  This is how the framework knows that this specific decorator factory class is used for this work execute mode – the enum-based attribute ties these classes together through the sysExtension framework.  The key point here is that if you need a custom decorator factory to define when exactly to switch to multi-scan mode, the above example is how you will enable this.

In the final workflow step, we need to process the incoming multi-scan results. As mentioned before, these are returned to the server in the same way as normal data – the value of the input control will simply contain a specially formatted string. Recall the discussion above about the format of the string being <scanned value>,<number of scans>|… In my simple example below, I am parsing this string using X++ and saving the assets to a new table associated with the container. In this case I am not making use of the second piece of information in the collection – the number of scans is not needed here.

Hopefully it is clear how we loop through all the scanned assets and save each one to the new table.  After this is complete, we reset the workflow and move back to the first stage in the state machine.

Example Workflow

Now that you have seen the code to enable this in a custom workflow, let’s walk through the demo. You can download the complete code for this project from the link at the bottom of this post – you just need to get it up and running on a development environment and configure the necessary menu items to enable the workflow in your system.

The initial screen shows the Container ID scanning field. Note that in the sample project I have included the necessary class to default this field to scanning mode – however, you will need to set this up in Dynamics as defined here.

Scanning a container ID (CONT-000000001 works if you are using USMF in the Contoso demo data) will navigate you to the next screen and enable the multi-scan Page Pattern.

Here you can enter any number of assets and the app will store them in the local buffer. As described above, you can view the scanned assets by clicking the icon in the lower left. After a few scans the UI is updated:

Clicking the list icon would show us the scans we have performed offline:

Finally, clicking the “submit” button on the main screen pushes the items to the server, where they are saved to the custom table, and the UI displays a success message.

Conclusion

Hopefully this helps you understand the new control scheme that was added and how it can enable fast scanning operations. The code used for this demo is available to download here – please note that this is demonstration code only and should not be used in a production system without extensive testing.
