Attend a Live WEBINAR about BA Training on 03-February-2016 @8:00 PM CST ‪#‎ZaranTech‬

Time : Wednesday February 03rd, 2016 @ 8:00 pm CST

You are most welcome to join our upcoming batch; the details are as follows:

Attend a Live Demo Session : Click here to Register
Batch Start Date : 5th Feb, 2016 @ 07:30 pm CST
For More Information
Class Schedule : Tue-Wed-Fri 07:30 pm CST

Contact : Lakshmi@ 515-309-2128, Email

Demo Video by Trainer Sai


Five Steps to Understand Product Costing in SAP FICO

In the Controlling (CO) module, product costing is used to value the internal cost of materials and production for management reporting and accounting. Many people avoid it because of its complexity and its high degree of integration with other modules.

The five steps in understanding product costing in SAP FICO are:

  1. Cost Center Planning
  2. Activity Rate Calculation
  3. Quantity Structure
  4. Costing Run
  5. Actual Cost

Now let's learn each step of product costing in detail.

Step #1: Cost Center Planning

Cost center planning is the first step in understanding product costing. The main objective of this phase is to plan total dollars and quantities for each cost center in a plant.


  • Company codes and plants are set up in the organizational structure.
  • Master data is created for profit centers, cost centers, primary and secondary cost elements, and activity types.

In transaction KP06, cost center dollars are planned by activity type and cost element; both fixed and variable dollars can be entered. You can also plan costs in overhead cost centers that later flow to production cost centers through allocations. In transaction KP26, cost center quantities are planned by activity type. Activity rates can be entered manually, based on the prior year's actual values. As a best practice, plan activity quantities based on useful installed capacity, which accounts for downtime.

Step #2: Activity Rate Calculation

The main aim of this phase is to calculate a plan rate for each activity type in each cost center in a plant.


  • Cost Center Plans are entered: Plan costs in KP06 and Plan activity units in KP26

Once we plan our cost center dollars and quantities, it's time to calculate the activity rates, which are used to value the internal activities that produce products. We can also use a blended approach: manually plan rates for a few cost centers and activities, and calculate the remaining rates from the planned costs and quantities.

If we plan costs directly in the production cost centers, we can skip the next step of plan allocations. Otherwise, use plan assessments and distributions to allocate the planned costs collected in overhead cost centers.

The key difference between assessments and distributions is that a distribution keeps the primary cost element (the identity) of the cost. Assessments use secondary cost elements, which act as cost carriers to move costs. We can use assessments, distributions, or a combination of both. Plan assessments and distributions are created in transactions KSU7 and KSV7 and executed in transactions KSUB and KSVB.

Once the costs are allocated, we should review the cost center Actual/Plan/Variance report. Next, execute plan cost splitting, which splits costs when a cost center has more than one activity type; the cost is split based on activity quantity or another basis. Activity type rates are then calculated using transaction KSPI. If a rate is not acceptable, you can revise the cost plans and recalculate the rates.
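At its core, what KSPI computes per cost center and activity type is just planned cost divided by planned activity quantity. The sketch below illustrates that arithmetic only; the cost centers, activity types, and figures are invented for the example and do not come from any real SAP data.

```python
# Hypothetical illustration of the activity rate arithmetic:
# rate = planned cost center dollars (KP06) / planned activity quantity (KP26).
# All names and numbers below are made up for this sketch.

plan_costs = {                          # planned dollars by (cost center, activity type)
    ("CC-1000", "MACHINE"): 120_000.0,
    ("CC-1000", "LABOR"): 80_000.0,
}
plan_quantities = {                     # planned quantities by (cost center, activity type)
    ("CC-1000", "MACHINE"): 4_000.0,    # machine hours
    ("CC-1000", "LABOR"): 2_000.0,      # labor hours
}

def activity_rates(costs, quantities):
    """Return the plan activity rate (cost per unit) for each cost center/activity."""
    return {key: costs[key] / quantities[key] for key in costs}

rates = activity_rates(plan_costs, plan_quantities)
print(rates)  # {('CC-1000', 'MACHINE'): 30.0, ('CC-1000', 'LABOR'): 40.0}
```

With these invented figures, each machine hour is valued at $30 and each labor hour at $40 when internal activities are confirmed against production.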

Step #3: Quantity Structure

This step helps you estimate the material components of manufactured goods and the cost of goods sold, based on the BOM and routing.


Master data is created:

  • Material Masters (including MRP, Accounting and Cost views)
  • Bill of Materials (BOM)
  • Work Centers (Cost Centers and Activity Types)
  • Routings (Production Planning) or
  • Master Recipes (Production Planning – Process Industries)
  • Production Versions
  • Product Cost Collectors (Production Planning Repetitive Manufacturing)

Quantity Structure is a key concept. It is a fundamental integration point between Finance and Logistics modules. There are several components of Quantity Structure namely:

  • A material master represents a material with a distinct fit/form in a plant. It contains many views, such as Material Requirements Planning (MRP), accounting and costing views. Procurement type and special procurement are the two key fields for costing: the procurement type field indicates whether a material is produced in-house, purchased externally, or both, whereas special procurement indicates, for example, that a material is subcontracted or procured from another plant.
  • A Bill of Materials (BOM) is created for each internally produced material. The BOM lists the component materials and quantities required to produce a semi-finished or finished good. The material cost of the product is calculated from the BOM components' prices according to their price control (standard price or moving average price).
  • A work center identifies a machine or work area where a production process is performed. In addition to the BOM, a routing is created to indicate the operations necessary to produce a material. In production planning, a routing is a series of operations, each referencing a work center and activity quantities.
  • A master recipe is used for batch-oriented process manufacturing. Rate routings and product cost collectors are used in repetitive manufacturing. Product cost collectors are created for each production version.

Production versions refer to a combination of a BOM and master recipe or routing required for material production.
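To make the roll-up idea behind the quantity structure concrete, here is a small sketch of how a finished good's material cost can be computed from its BOM: component quantity times component price, recursing into semi-finished goods that have BOMs of their own. The materials, quantities, and prices are invented for illustration and are not real SAP master data.

```python
# Hypothetical BOM cost roll-up: the material cost of a produced good is the
# sum of (component quantity x component price), recursing into semi-finished
# goods that have their own BOMs. All materials and prices are invented.

boms = {
    "FINISHED-1": [("SEMI-1", 2), ("RAW-B", 4)],  # (component, quantity per unit)
    "SEMI-1": [("RAW-A", 3)],
}
prices = {"RAW-A": 5.0, "RAW-B": 2.5}             # maintained material prices

def rolled_up_cost(material):
    """Return the material's price if maintained, else roll up its BOM."""
    if material in prices:
        return prices[material]
    return sum(qty * rolled_up_cost(comp) for comp, qty in boms[material])

print(rolled_up_cost("SEMI-1"))      # 3 * 5.0 = 15.0
print(rolled_up_cost("FINISHED-1"))  # 2 * 15.0 + 4 * 2.5 = 40.0
```

The real cost estimate also rolls activity costs from the routing into each level, but the BOM recursion shown here is the structural heart of it.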

Step #4: Costing Run

A costing run is used to cost large volumes of materials in a particular company code. It lets the user select materials, explode the quantity structure, cost, analyse, mark and release.


  • Material Masters (MRP, Accounting and costing views)
  • Quantity Structure (BOM, Master Recipe or Routing and Production versions)
  • Condition types and production Information records
  • Configuration
  • CO Master Data

Materials are costed during the annual or monthly costing process. Transaction CK40N is used to execute costing runs, analyse the results, and mark and release costs. A costing run is created for a controlling area, costing version, costing variant, company code and transfer control; therefore a costing run can be made for only one company code at a time. It is also created for a specific date range.

The costing run has six steps:

  1. Selection
  2. Structure Explosion
  3. Costing
  4. Analysis
  5. Marking
  6. Release

After executing each step, the error log has to be reviewed and any errors resolved before executing the next step. If the results do not update after execution, press the refresh button.
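The "run a step, review the log, fix, then continue" discipline can be pictured as a simple loop over the six CK40N steps. The step names below come from CK40N itself, but the `execute_step` callback and its error list are stand-ins for this sketch, not a real SAP API.

```python
# Illustrative sketch of the CK40N discipline: execute each costing-run step
# in order and stop on errors. The execute_step callback is a stand-in that
# returns a list of error messages for the step it ran.

STEPS = ["Selection", "Structure Explosion", "Costing",
         "Analysis", "Marking", "Release"]

def run_costing_run(execute_step):
    """Run all six steps in order; return the list of completed steps."""
    completed = []
    for step in STEPS:
        errors = execute_step(step)
        if errors:
            # In CK40N you would review the error log, resolve the errors,
            # and re-execute this step before continuing to the next one.
            raise RuntimeError(f"{step} produced {len(errors)} error(s); "
                               "resolve them and re-execute this step")
        completed.append(step)
    return completed

print(run_costing_run(lambda step: []))  # a clean run completes all six steps
```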

Step #5: Actual Cost

Actual cost is determined from actual expenses, purchase prices and confirmed production quantities. These costs are compared to the standard costs through variance analysis, to measure profitability and support management decisions.


  • Material Masters (MRP, Costing and Accounting views)
  • Quantity Structure (Routers/Master Recipe, BOM and Production versions)
  • Configuration (WIP, Variance or settlement)
  • CO Master Data (Activity types, Actual and Primary and secondary cost elements)
  • Assessment/Distribution Cycles, Actual Statistical Key Figures

In product cost by order, the production confirmation records the actual production yield, scrap, and activity quantities. The production costs are collected on the production orders for review and settlement. In product cost by period, product cost collectors are used to calculate WIP, variances, and settlement instead of individual production orders.

In repetitive manufacturing, WIP and scrap quantities are valued at target cost, based on the valuation variant. In discrete manufacturing, WIP is the difference between the debits and credits of an order.
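Those two calculations can be sketched in a few lines: WIP in discrete manufacturing as order debits minus credits, and total variance as actual costs against target cost. The order balances below are invented for illustration; in SAP the actual calculation is driven by the order status and the valuation variant.

```python
# Hypothetical order balances to illustrate the WIP and variance arithmetic.
# All figures are invented for this sketch.

order = {
    "debits": 10_500.0,      # actual costs posted to the order (materials, activities)
    "credits": 9_000.0,      # goods-receipt credits at standard cost
    "target_cost": 9_800.0,  # target cost for the confirmed quantity
}

def wip(order):
    """Discrete manufacturing: WIP = order debits - order credits (while open)."""
    return order["debits"] - order["credits"]

def total_variance(order):
    """Once fully delivered: variance = actual costs (debits) - target cost."""
    return order["debits"] - order["target_cost"]

print(wip(order))             # 1500.0
print(total_variance(order))  # 700.0
```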

Variance analysis examines variances on both the input side and the output side (covered in more depth in SAP Finance training). Finally, we must settle our orders or product cost collectors, which are debited with actual costs during production.

For SAP FICO training needs, call 515-978-9036 or email.


Attend a Live WEBINAR about BA with Healthcare Training on 02-Feb-2016 @7:30 PM CST ‪#‎ZaranTech‬


Time : Tuesday Feb 02nd , 2016 @ 7:30 pm CST

You are most welcome to join our upcoming batch; the details are as follows:

Attend a Demo Session : Click here to Register
Batch Start Date : 02nd Feb, 2016 @ 7:30 pm CST
For More Information
Class Schedule : Mon-Thu 7:30 pm CST 3hrs each session

Contact : Lakshmi @ 515-309-2128, Email :

Demo Video by Trainer Anil


What is Hadoop good for, and what is it not?

This article explains the advantages and disadvantages of Hadoop. As the backbone of so many implementations, Hadoop is practically synonymous with big data. Offering distributed storage, high scalability, and strong performance, it is viewed by many as the standard platform for high-volume data infrastructures. To learn more about Hadoop, click on Hadoop Certification.


 Advantages of Hadoop

The following are the advantages of Hadoop:

  • Scalable: Hadoop is a highly scalable storage platform, because it can store and distribute very large data sets across hundreds of inexpensive servers that operate in parallel. Unlike traditional relational database systems (RDBMS) that can't scale to process large amounts of data, Hadoop enables businesses to run applications on thousands of nodes involving thousands of terabytes of data.
  • Cost effective: Hadoop allows businesses to easily access new data sources and tap into different types of data (both structured and unstructured) to generate value from them. This means businesses can use Hadoop to derive valuable business insights from data sources such as social media and email conversations. Hadoop can be used for a wide range of purposes, such as log processing, recommendation systems, data warehousing, market campaign analysis and fraud detection.
  • Fast: Hadoop's storage method is based on a distributed file system that essentially 'maps' data wherever it is located on a cluster. The tools for data processing are typically on the same servers where the data is located, resulting in much faster processing. If you are working with large volumes of unstructured data, Hadoop can process terabytes of data in minutes, and petabytes in hours. To learn more about HDFS, click Big Data Hadoop Certification.
  • Resilient to failure: Fault tolerance is a significant advantage of using Hadoop. When data is sent to an individual node, it is also replicated to other nodes in the cluster, so another copy is available for use if a node fails.
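The "processing lives where the data lives" point is easiest to see with MapReduce, Hadoop's processing model. Below is a minimal word count written in the Hadoop Streaming style: a mapper that emits (word, 1) pairs and a reducer that sums them. This is a simplified in-process sketch; on a real cluster each mapper would run on the node holding its block of data, and Hadoop would shuffle and sort pairs between the two phases.

```python
# Minimal word count in the Hadoop Streaming style. Simplified sketch:
# both phases run in-process here instead of on cluster nodes.
from collections import defaultdict

def mapper(line):
    """Emit a (word, 1) pair for every word in one line of input."""
    for word in line.split():
        yield word.lower(), 1

def reducer(pairs):
    """Sum the counts for each word (Hadoop would group/sort between phases)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big cluster", "data locality matters"]
pairs = [pair for line in lines for pair in mapper(line)]
print(reducer(pairs))
# {'big': 2, 'data': 2, 'cluster': 1, 'locality': 1, 'matters': 1}
```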


Disadvantages of Hadoop

The following are the disadvantages of Hadoop:

  • Security Concerns: Managing complex applications such as Hadoop can be challenging. A simple example can be seen in the Hadoop security model, which is disabled by default due to sheer complexity. If whoever is managing the platform doesn't know how to enable it, your data could be at huge risk. Hadoop also lacks encryption at the storage and network levels, a major concern for government agencies and others that prefer to keep their data under wraps.
  • Vulnerable by Nature: Speaking of security, the framework itself makes running Hadoop a risky proposition. It is written almost entirely in Java, one of the most widely used yet most controversial programming languages in existence.
  • Not Fit for Small Data: Not all big data platforms are suited to small data needs, and unfortunately Hadoop is one of them. The Hadoop Distributed File System (HDFS) cannot efficiently support the random reading of small files because of its high-capacity design. As a result, it is not recommended for organizations with small quantities of data.
  • Potential Stability Issues: Like all open source software, Hadoop has had its share of stability issues. To avoid them, organizations are strongly advised to run the latest stable version, or to run it under a third-party vendor equipped to handle such problems.
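The small-files problem above is largely arithmetic: the HDFS NameNode keeps every file and block as an object in memory, commonly estimated at roughly 150 bytes of heap per object (a rule of thumb, not an exact constant, and it varies by version and configuration). So millions of tiny files cost far more metadata than the same bytes stored as a few large files:

```python
# Back-of-the-envelope estimate of NameNode memory for file metadata.
# Assumes the commonly cited ~150 bytes of heap per namespace object
# (file or block); real usage varies by Hadoop version and configuration.

BYTES_PER_OBJECT = 150  # rough rule of thumb

def namenode_metadata_bytes(num_files, blocks_per_file=1):
    """Each file contributes one file object plus its block objects."""
    objects = num_files * (1 + blocks_per_file)
    return objects * BYTES_PER_OBJECT

# 10 million small files (one block each) vs. the same data in
# 10,000 large files of 80 blocks each:
small = namenode_metadata_bytes(10_000_000)
large = namenode_metadata_bytes(10_000, blocks_per_file=80)
print(small, large)  # 3000000000 121500000
```

Under this rough model, the small-file layout needs about 3 GB of NameNode heap for metadata alone, versus roughly 120 MB for the large-file layout, which is why HDFS favors fewer, larger files.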

To know more about implementation of big data Hadoop, click on Hadoop Big Data Online Course.

For Big Data Hadoop training needs, call 515-978-9036 or email.


Attend a Live WEBINAR about Salesforce Training on 27-January-2016 @7:30 PM CST ‪#‎ZaranTech‬


Time : Wednesday January 27th , 2016 @ 7:30 pm CST

You are most welcome to join our upcoming batch; the details are as follows:

Demo Date : 27th January, 2016 @ 7:30 pm CST
Class Schedule : 31st January, Sat & Sun 11:30 am CST 3 hrs each session
Attend a Live Demo Session : Click here to Register

Contact : Richard @ 515-309-2098, Email

Demo Video by Trainer Amit


Attend a Live WEBINAR about PMP Training on 28-January-2016 @7:30 PM CST ‪#‎ZaranTech‬


Time : Thursday January 28th, 2016 @ 7:30 pm CST

You are most welcome to join our upcoming batch; the details are as follows:

Attend a Live Demo Session : Click here to Register
Batch Start Date : 30th January, 2016 @ 9:00 am CST
For More Information
Class Schedule : Sat-Sun 9:00 am CST

Contact : Mike@ 515-309-2159, Email :

Video by Trainer Hasnain


Which Better Serves Your Big Data Business Needs?



Big data is a powerful tool for businesses looking to leverage huge volumes of data for competitive advantage and profit. Companies must choose the platform that best fits their big data problems.

To know which functionality will best serve the business use case, the following questions need to be asked when choosing between a traditional system and Big Data Hadoop (including cloud-based Hadoop services such as Qubole).

Question #1: What type of data is being analysed? (Structured or unstructured)

Structured data is data that resides within the fixed confines of a file or record. Even in large volumes, it can be entered, stored, queried and analysed in a simple manner. A traditional database will serve this type of data better.

For example, enterprise resource planning, backup storage for large volumes of data etc.

Semi-structured data is data that is not organised into a specialized repository such as a database, yet is neither raw data nor data typed into a conventional database. This type of data is often handled through data integration.

For example: web logs that track website activity, call center logs, etc.

Unstructured data is data that comes from various sources such as photos, emails, text documents, audio files and social media. As unstructured data is complex and large in volume, a traditional database cannot serve it efficiently.

For example: Facebook, LinkedIn, Logs, Web chats, YouTube etc.

Hadoop has the ability to join, aggregate and analyse multiple data sources without first structuring the data. Thus Hadoop is the right tool for companies looking to store, manage and analyse large volumes of unstructured data.

Question #2: Which database system serves better for a scalable analytics infrastructure?

Scalability allows servers to accommodate increasing workload demands.

A traditional database will serve better for companies whose workloads are constant and predictable, while a Hadoop infrastructure will assist companies whose data demands keep growing.

Question #3: Which database system implementation is cost-effective?

Cost-effectiveness is a main concern for companies looking to adopt new technologies. When implementing Hadoop, companies must make sure the advantages of Hadoop outweigh the cost. Otherwise, a traditional database may better meet their workload and data storage needs.

Nowadays, companies implement hybrid systems that integrate Hadoop with a traditional database to combine the benefits of both platforms.

Question #4: Is fast data analysis critical?

Hadoop was designed for large-scale data processing and touches every file in the data set, a process that takes time. Fast performance is not critical for some tasks, such as running analytics, producing end-of-day reports to review daily transactions, and scanning historical data.

In other scenarios, a traditional database is the better option for companies that rely on time-sensitive data analysis, because it performs well when analysing smaller data sets in real time.

Some companies use hybrid systems, where small, time-sensitive data sets run on a traditional database and Hadoop is used to process huge, complex workloads.

Question #5: Which approach fits best?

It always depends on the company, as big data analytics provides deeper insights that lead to real competitive advantage. For companies willing to invest persistent and careful effort, the tool that best fits these needs is often Hadoop.

How to become a Certified Salesforce Developer?

The Salesforce Developer certification validates skills in designing custom-made applications and analytics aided by the declarative features of the platform. A certified Salesforce developer can modify Salesforce applications and business processes and maintain complex workflows.

Salesforce training suits organizations that want to know their customers better, help their sales representatives with superior account planning, provide customizable views of sales, and gain a good understanding of sales trends.

This certification is most suited for:

  • Individuals looking forward to build a career on salesforce
  • Developers
  • System Administrators
  • Sales Representatives

Here is the path to Salesforce Developer certification; but before that, let's look at the course details.


The demand for Salesforce developers is growing rapidly and companies are looking for certified professionals. There are three levels of developer certification:

  • Developer: This certification is a base-level accomplishment, in which you learn how to use the declarative tools (point and click) on the platform.
  • Advanced Developer: In this certification, you will learn how to create Apex triggers and custom Visualforce controllers, and how to integrate using APIs. The Certified Developer credential is the prerequisite for the Advanced Developer credential.
  • Architect: In this certification, you learn to design secure, high-performance technical solutions on the platform. Certified Salesforce Developer is a prerequisite for this certification.

What are the course objectives?

Once you complete this salesforce training, you will be able to:

  • Deploy applications and manage changes on the platform
  • Develop applications using declarative tools
  • Implement Automation, debugging, data validation and customization of applications
  • Get an overview of Visualforce, sites and Apex

Career Growth

Today's job market is competitive, and it is important to add value to your curriculum vitae before you submit it for job consideration. Salesforce statistics show that professionals in this profile are in demand. Pay for Salesforce developers is high, and they are among the most in-demand professionals because they hold a recognizable credential. Organizations are looking for individuals with strong, competitive skills, and the Salesforce Developer certification gives you an edge over your peers.

Job Trend in Market

Salesforce provides an amazing set of sales, service, marketing, collaboration and analytic capabilities. Salesforce Developer can create customizable solutions to organizations. In India, salesforce is used by small companies to large enterprise companies.

  • More than 200,000 customer companies have implemented the Salesforce platform.
  • Companies are finding innovative ways to use the Salesforce platform through Salesforce system integrators.
  • 2,000 companies have implemented AppExchange solutions built on top of the Salesforce platform.

The following image shows the trends in Salesforce jobs:

If you are interested in building up your Salesforce skills, click this link to find an online Salesforce certification course to kick-start your career and demonstrate your expertise.

Why Project Management is known as a recognized and strategic organizational competence

‘At its most fundamental, project management is about people getting things done,’
Dr Martin Barnes, APM President 2003-2012.

Before we understand why project management is known as a strategic organizational competence, let's look at the definitions of a project and of project management, and at the phases of project management.


A project is a temporary effort undertaken to create a unique product, service or result. It has a defined start and end time, and a defined scope and set of resources.


It is a unique, transient endeavour undertaken to accomplish planned objectives, as opposed to routine operations. It comprises a set of tasks to achieve a single goal, defined in terms of outputs, outcomes or benefits. Examples include software development for an enhanced business process, construction of a bridge, or expansion of sales into a new market.

A project is considered successful if it meets its objectives and acceptance criteria within the agreed budget and timeline.

Project Management

Project management is the application of methods, processes, skills and experience to accomplish the project objectives. It is the science of organising the components of a project, whether that project is a product development, a new service launch, or organising a trip.

Phases of Project


1. Definition: Project manager defines the scope of the project, list of deliverables and the outcome of the specific activities.

2. Planning: In this phase, a project plan or project charter is put in writing, outlining the work to be done. The team should prioritize the project, calculate the budget and schedule, and identify the resources needed.

3. Execution: The project manager must know how many resources and how much budget are available for the project. The manager then assigns the resources and allocates the budget to the teams.

4. Control: The project manager compares the status and progress of the project against the plan as the scheduled work is performed.

5. Closure: Once the project is successfully completed and the customer has approved the deliverables, an evaluation takes place to highlight success of the project and learn from project history.


Project management knowledge draws on areas such as integration, scope, time, cost, quality, procurement, human resources, communication, and risk management. All management is concerned with these, but project management brings a distinctive focus shaped by the goals, schedules and resources of each project.

The value of that focus is attested by the rapid, global growth of project management as a recognized and strategic organizational competence, as a subject for training and education, and as a career path.

Importance of PMP Certification

The value and importance of PMP training and certification is hotly debated in the project management community. Even so, IT recruiters and corporate executives often regard the certification as an indicator of project success.

Attend a Live WEBINAR about QA Training on 21-January-2016 @7:30 PM CST ‪#‎ZaranTech‬

Quality Assurance Training

Time : Thursday January 21st, 2016 @ 7:30 pm CST

You are most welcome to join our upcoming batch; the details are as follows:

Attend a Live Demo Session : Click here to Register
Batch Start Date : 23rd January, 2016 @ 07:30 pm CST
For More Information :
Class Schedule : Mon-Thu 7:30 pm CST (3-4 hrs each session)

Contact : Lakshmi @ 515-309-2128, Email :

Demo Video by Trainer Sridhar
