Chrome Extension – Ukrainian Phonetic Keyboard Mapping

There is an educational need for Ukrainian students to use the Ukrainian Phonetic keyboard when writing documents or papers on their Chromebooks. However, no Ukrainian phonetic Chrome extension was available, and the one provided here: https://github.com/nagornyi/ukrainian-phonetic-keyboard-chromeos did not fulfill the requirements needed for students using a particular Chrome […]
Leveraging PySpark to Automate Jira Ticket Creation

I am creating a PySpark script in Microsoft Fabric to automate Jira API ticket-creation (POST) calls against a lakehouse delta table. To do so, I have to educate myself on PySpark syntax and the framework, and I will be doing that in this blog post. To reference a lakehouse delta table: Result: This was able […]
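The actual table reference and POST call are elided in the excerpt above; as a minimal sketch of the shape such a script could take (the row field names, project key, lakehouse path, and Jira URL are all assumptions, not the post's actual values):

```python
# Sketch only: field names and the project key below are hypothetical.
def build_jira_payload(row: dict, project_key: str = "HELP") -> dict:
    """Map one delta-table row to a Jira REST issue-creation payload."""
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": row["subject"],
            "description": row["details"],
            "issuetype": {"name": "Task"},
        }
    }

# In a Fabric notebook, the rows would come from the lakehouse, e.g.:
#   df = spark.read.format("delta").load("Tables/tickets")  # path is an assumption
#   for r in df.collect():
#       requests.post(f"{jira_url}/rest/api/2/issue",
#                     json=build_jira_payload(r.asDict()), auth=auth)
```

Keeping the row-to-payload mapping in a plain function like this makes it testable without a Spark session or a live Jira instance.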
Automating Ticket Migration for Jira

I have been tasked with automating the migration of over 100k SolarWinds Helpdesk tickets to Jira Service Management as we prepare for the helpdesk transition. Plan: My current plan is to build an ETL in Microsoft Fabric that ingests CSV ticket data from SolarWinds and manipulates field values to fit our specific Jira […]
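The field-manipulation step of such an ETL could be sketched like this; the SolarWinds column names and Jira status names here are hypothetical, since the real export schema isn't shown in the excerpt:

```python
# Hypothetical mapping; real SolarWinds export columns and Jira statuses will differ.
STATUS_MAP = {"Open": "To Do", "In Progress": "In Progress", "Closed": "Done"}

def transform_ticket(sw_row: dict) -> dict:
    """Reshape one SolarWinds CSV row into the fields Jira Service Management expects."""
    return {
        "summary": sw_row["subject"],
        "status": STATUS_MAP.get(sw_row["status"], "To Do"),
        "reporter": sw_row["requester_email"],
    }
```

At 100k+ rows, this per-row function would be applied across the ingested CSV data (e.g. as a mapping over a DataFrame) rather than called one ticket at a time.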
Microsoft Fabric & Spark SQL

The most common way to work with data in delta tables in Spark is to use Spark SQL. Let's say we have a table in OneLake called products: In the connected Spark SQL, we can insert data like so: When you want to work with delta files rather than catalog tables, it may be simpler […]
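The insert statement and the delta-file query are elided in the excerpt; a hedged Spark SQL sketch of both (the column values and file path are assumptions):

```sql
-- Insert into the products catalog table (column order and types are assumptions):
INSERT INTO products VALUES (1, 'Mountain Bike', 'Bikes', 1299.99);

-- Query delta files directly instead of a catalog table (path is an assumption):
SELECT * FROM delta.`Files/products`;
```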
Microsoft Fabric: Optimize Performance by Partitioning Delta Tables

In OneLake, we can partition data so that performance improves through data skipping. Consider a situation where large amounts of sales data are being stored. You could partition the sales data by year; the partitions are stored in subfolders named “year=2021”, “year=2022”, etc. If you only want to report on sales […]
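As a sketch of what that partitioning looks like in Spark SQL (the table name and non-partition columns are assumptions; only the year partition comes from the post):

```sql
-- Spark writes the rows into year=2021/, year=2022/, ... subfolders:
CREATE TABLE sales (SalesOrderNumber STRING, OrderTotal DOUBLE, year INT)
USING DELTA
PARTITIONED BY (year);

-- A filter on the partition column lets Spark skip every other year's folder:
SELECT * FROM sales WHERE year = 2021;
```

The data skipping only helps when queries actually filter on the partition column, so the column you report on most is usually the one to partition by.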
Microsoft Fabric: Optimizing Database Table Architecture

Very simply, we can optimize lakehouse data table architecture. Where Spark is used, Parquet files are immutable, and as such we end up storing a lot of small files. The OPTIMIZE function reduces the number of files by collating them into larger ones. Within […]
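The command itself is elided in the excerpt; in Spark SQL, the small-file compaction is a one-liner (the table name is an assumption, and the VORDER variant is Fabric-specific):

```sql
-- Collate many small parquet files into fewer, larger ones:
OPTIMIZE sales;

-- In Fabric, V-Order sorting can be applied during the same compaction:
OPTIMIZE sales VORDER;
```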
Microsoft Fabric: OneLake Data Upload / Table Transition / Query & Report Building

To take raw data in a CSV file and transition it to a table format: in OneLake, access Files > data > ellipsis > Upload > Upload files. Here, you will be able to upload a CSV file or an Excel file. From the data file, select the ellipsis > Load to Tables > […]
Prevent Posting of Transactions W/O Specific Dimensions in Business Central

Navigate to the Dimensions page, select the required dimension, and select Account Type Default Dim. under the Edit List tab. Set Table ID to the appropriate table; in my case, I set it to 15: G/L Account. In addition, set the Value Posting field to Code Mandatory.
Adjusting Item Sales Unit Price in Business Central

In the case where a company wants to adjust an item’s unit price based on a flat profit rate of 20%, you have two options: 1. Run the Adjust Item Costs/Prices batch job: fill in what to adjust, the Adjust field, the adjustment factor (the adjustment field amount is multiplied by this value), and the rounding method. 2. […]
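As a worked example of the adjustment-factor arithmetic, assuming the 20% profit is a markup on unit cost (so the factor entered in the batch job would be 1.2; a margin-based calculation would use a different factor):

```python
def adjusted_unit_price(unit_cost: float, profit_rate: float = 0.20) -> float:
    """Apply a flat markup: multiply by an adjustment factor of 1 + rate."""
    return round(unit_cost * (1 + profit_rate), 2)
```

For example, an item costing 100.00 would be repriced to 120.00.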
Direct Cost Applied Accounts in Business Central

When purchasing inventory in Business Central, we see that it creates 4 entries. The reason for this is that separate purchase accounts in BC provide an easy way to view inventory purchases. They allow reporting to capture cost of goods sold more clearly, as COGS = Beginning Inventory + Purchases - Ending Inventory […]
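The COGS identity above can be checked with a quick worked example (the figures here are made up for illustration):

```python
def cogs(beginning_inventory: float, purchases: float, ending_inventory: float) -> float:
    """COGS = Beginning Inventory + Purchases - Ending Inventory."""
    return beginning_inventory + purchases - ending_inventory

# E.g. start the period with 10,000 of inventory, purchase 4,000 more,
# and end with 9,000 on hand: 5,000 of inventory was sold through as COGS.
```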