
Sarahjohns388's Posts



Computers / Advance Your Career With Oracle Apps DBA Training In New York Free Demo Tutorial by sarahjohns388: 1:05pm On Oct 25, 2018
Oracle Database 12C Architecture

This section presents the key concepts and a general overview of the components that deliver this proprietary database management system. These components fall into three main categories:

Shared memory: a section of the host server's memory through which all data passes and in which application code is stored and executed.

System support infrastructure: a mix of background and foreground processes that perform the tasks required to facilitate application interaction with the Oracle 12c database.

Operating system files: a suite of no fewer than ten files, each playing an individual role as the database runs.

The next three sections address these components and provide a bird's-eye view of what they do.

Shared Memory

Shared memory is nothing more than a newfangled name for what was, and sometimes still is, referred to as RAM — random access memory. As the 12c database is started, a handful of entries in its system parameter file determine the size of the memory allocated to the instance. Many adopters of the Oracle technology use the words "database" and "instance" synonymously, but there is a fundamental difference between the two.

A database is an assortment of files that store data, plus a handful of worker files that facilitate application access. An instance is a segment of shared memory and a set of support processes that provide the capability for applications to work with the data stored in the database.

Once the instance is started, the following areas of shared memory play a role in database management activities:

The system global area, or SGA, contains data and control information for a single database instance.

The program global area, or PGA, is part of the memory allocated to a 12c instance as it is started. Unlike the memory in the SGA, PGA memory is not shared; it contains data and control information specific to server processes, not the instance as a whole.

The user global area, or UGA, is memory associated with each user session. Even though it is allocated from PGA memory, the UGA is discussed as one of the four main memory components.

Software code areas are where SQL code is prepared for execution and sits in memory until used.
It would be impossible to get into the details of each of these components here; the point is that as you encounter the memory structures that support a running Oracle instance, the terminology will not be brand new. The figure below is a graphical representation of the bullet points just discussed, with minimal drill-down.

[Figure: overview of shared memory components for a 12c database]
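As a quick way to see these allocations on a running instance, the standard dynamic performance views can be queried. A minimal sketch (the view and statistic names are from the stock V$ catalog; actual output depends on your instance):

```sql
-- Total SGA size, broken into its fixed, variable, buffer, and redo portions
SELECT * FROM v$sga;

-- Current and peak PGA allocation across all server processes, in megabytes
SELECT name, ROUND(value / 1024 / 1024) AS mb
FROM   v$pgastat
WHERE  name IN ('total PGA allocated', 'maximum PGA allocated');
```

Both queries require a session with access to the V$ views (for example, a DBA account).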
Oracle Database 12c offers two approaches to memory management, manual and automatic:

Automatic memory management: a maximum amount of memory that can be used is defined, and the instance self-manages the sizes of the assortment of SGA components. This approach is recommended by Oracle and first appeared around the turn of the century; it is referred to as automatic shared memory management (ASMM).

Manual memory management: the administrator specifies fixed sizes for the components that make up the SGA; each component size is specified in the system parameter file and read as the 12c instance starts.
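In the system parameter file, the two approaches reduce to a handful of initialization parameters. A minimal sketch using real parameter names (the sizes are illustrative, not recommendations):

```sql
-- Automatic approach: set one target and let the instance size the components
ALTER SYSTEM SET memory_target = 4G SCOPE = SPFILE;

-- Manual approach: disable the target and size each SGA component explicitly
ALTER SYSTEM SET sga_target       = 0    SCOPE = SPFILE;
ALTER SYSTEM SET db_cache_size    = 2G   SCOPE = SPFILE;
ALTER SYSTEM SET shared_pool_size = 1G   SCOPE = SPFILE;
ALTER SYSTEM SET large_pool_size  = 256M SCOPE = SPFILE;
```

Changes made with SCOPE = SPFILE take effect the next time the instance is started.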

Interaction with the database through the instance is brokered by the system support processes introduced next.

System Support Processes

These processes are initiated automatically as an Oracle instance is started. Each one plays a role in the management of application interaction with the data.
Programming / Learn Oracle Apps Technical Training In Bangalore Free Demo by sarahjohns388: 12:10pm On Sep 26, 2018
About Training:

The Mindmajix Oracle Apps Technical training module helps you understand the architecture and table relationships, and teaches you to design reports and forms using Oracle Apps. It also helps you connect your business to the cloud efficiently.

Is there a certification exam for Oracle Apps Technical?

A significant number of certifications are available for Oracle Apps Technical. From admins to consultants, Oracle has invested heavily in supporting those who work with this integrated suite of business applications. Some of the certification exams related to various industries include:



Oracle CRM On Demand Essentials

Oracle Value Chain Planning: Demantra Demand Management 7 Essentials

Oracle Advanced Controls Applications 2014 Essentials

Oracle Unified Method 5 Essentials



By opting for our well-structured training at Mindmajix, candidates can gain knowledge to enter the lucrative domain of Oracle Apps Technical and earn the certification.


For more information regarding various types of certification available, please go through the Oracle Apps Technical Certification.


Contact info:

USA : +1 201 378 0518

INDIA: +91 9246333245

email: info@mindmajix.com

Website: https://mindmajix.com/

Programming / Learn Best Uipath Certification Training By Experts In Bangalore by sarahjohns388: 8:16am On Aug 30, 2018
About Course:

UiPath offers a number of unique benefits for business, among them its compatibility and simple operation. A lot of effort can be saved by adopting UiPath Studio. The continuously growing scope of, and demand for, RPA professionals has prompted many organizations to come up with some of the best automation tools. Enroll for a UiPath Training demo!

Why This Course?

Avg. salary for a UiPath developer: $90,068 per annum.

Used by top companies across various business verticals, e.g. EXL Services, IBM, etc.

Attend a free demo class here! Mindmajix

Contact Us:

USA : +1 201 378 0518, +1 972-427-3027
IND - +91 9246 333 245
email: info@mindmajix.com
Url: https://mindmajix.com/uipath-training
Website: https://mindmajix.com/

Science/Technology / Is Hadoop The Next Generation Of The Database? by sarahjohns388: 7:53am On Apr 20, 2018
At the time, a long line of startups were offering a new breed of database designed to store and analyze much larger amounts of data. Greenplum. Vertica. Netezza. Hammerbacher and Facebook tested them all. But they weren't suited to the task either.

In the end, Facebook turned to a little-known open source software platform that had only just gotten off the ground at Yahoo. It was called Hadoop, and it was built to harness the power of thousands of ordinary computer servers. Unlike the Greenplums and the Verticas, Hammerbacher says, Hadoop could store and process the ever-expanding sea of data generated by what was quickly becoming the world's most popular social network.

Over the next few years, Hadoop reinvented data analysis not only at Facebook and Yahoo but at so many other web services. And then an army of commercial software vendors started selling the thing to the rest of the world. Soon, even the likes of Oracle and Greenplum were hawking Hadoop. These companies still treated Hadoop as an adjunct to the traditional database – as a tool suited only to certain types of data analysis. But now, that's changing too.

On Monday, Greenplum – now owned by tech giant EMC – revealed that it has spent the last two years building a new Hadoop platform that it believes will leave the traditional database behind. Known as Pivotal HD, this tool can store the massive amounts of information Hadoop was created to store, but it's designed to ask questions of this data significantly faster than you can with the existing open source platform.

"We think we're on the verge of a major shift where businesses are looking at a set of canonical applications that can't be easily run on existing data fabrics and relational databases," says Paul Maritz, the former Microsoft exec who now oversees Greenplum. Businesses need a new data fabric, Maritz says, and the starting point for that fabric is Hadoop.

That's a somewhat surprising statement from a company whose original business was built around a relational database – software that stores data in neat rows and columns. But Greenplum and EMC are just acknowledging what Jeff Hammerbacher and Facebook learned so many years ago: Hadoop – for all its early faults – is well suited to storing and processing the massive amounts of data facing the modern business.

What's more, Greenplum is revamping Hadoop to operate more like a relational database, letting you rapidly ask questions of data using the structured query language, or SQL, which has been a staple of the database world for decades. "When we were acquired [by EMC], we really believed that the two worlds were going to fuse together," says Greenplum co-founder Scott Yara. "What was going to be exciting is if you could take the massively parallel query processing technology in a database system [like Greenplum] and basically fuse it with the Hadoop platform."

The trouble with Hadoop has always been that it takes so much time to analyze data. It was a "batch system." Using a framework called Hadoop MapReduce, you had the freedom to build all sorts of complex programs that crunch enormous amounts of data, but when you gave it a task, you could wait hours – or even days – for a response.

With its new system, Greenplum has worked to change that. A team led by former Microsoft database designer Florian Waas has designed a new "query engine" that can more quickly run SQL queries on data stored across a massive cluster of systems using the Hadoop File System, or HDFS. Open source tools such as Hive have long provided ways of running SQL queries on Hadoop data, but this too was a batch system that needed a fair amount of time to complete queries.

This query engine will make its debut later this year as part of Pivotal HD. Greenplum is now a key component of an EMC subsidiary called The Pivotal Initiative, which seeks to bring several new age web technologies and techniques to the average business.

This time, Greenplum is in lock-step with Jeff Hammerbacher. After leaving Facebook, Hammerbacher helped found a Hadoop startup known as Cloudera, and late last year, he unveiled a system called Impala, which also seeks to run real-time queries atop Hadoop. But according to Waas and Yara, Pivotal HD is significantly faster than Impala and the many other tools that run SQL queries atop Hadoop. Yara claims that it's at least 100 times faster than Impala.

The caveat, says Waas, is that if a server crashes while Pivotal HD is running a query, you're forced to restart the query. This is a little different from what people have come to expect when running jobs on Hadoop, which was specifically designed to keep running across a large cluster of servers even as individual machines started to fail – as they inevitably do.

"The query extensions of Pivotal HD behave slightly differently in that they require a restart of the query when a machine is lost," he says. "An individual query needs to be restarted but the integrity, accessibility and functionality of the system is guaranteed to continue. We consider this a small price to pay for several orders of magnitude performance enhancement as we do not materialize any results during processing."

The traditional database will always have its place. Even Greenplum will continue to offer its original data warehouse tool, which was based on the open source PostgreSQL database. But the company's new query engine is yet another sign that Hadoop will continue to reinvent the way businesses crunch their data. Not just web giants. But any business.
Education / Advanced Qlikview Security Permissions By Mindmajix In New York by sarahjohns388: 11:21am On Apr 09, 2018
When developing dashboards that display sensitive information, there are often a variety of security requirements that must be considered during design and development. These requirements depend on the number and types of users and the structure of the business. Very commonly, some users should have access to only a subset of data while others should be able to view all of the data. This is known as row-level security.

In this article, we will walk through such a scenario that we encountered when developing a dashboarding and reporting solution for a healthcare company's High Performance Contact Center. This solution leveraged the QlikView reporting tool, and the information below should be useful for QlikView developers who encounter similar security requirements.
The contact center’s hierarchical structure required the following row-level security:

Any non-agent user (i.e. a user that doesn’t take calls; typically supervisors and executives) that has access to view the application should be able to see all data, with the ability to drill-down to see data on specific agents.
Agent users should be able to see detailed data for themselves, no detailed data on other agents, and only aggregated data for their team.
There are a variety of methods to apply row-level security within QlikView, but we found that the Section Access security model was the simplest to implement and maintain given the requirements provided.
Every transactional record within our data model is specific to an agent denoted by the EmployeeID field. We used this field as the reduction field within our Section Access security model. This reduction field filters the data for each user by associating to the data model upon accessing the application and reducing the available data based on the reduction field values related to that user. This made it possible to provide Supervisor users access to all employee data through the security model by using the asterisk symbol (*) in the reduction field denoting all values listed, and provide Agent users access only to their own data by explicitly listing their EmployeeID in the reduction field.
NOTE: whenever Section Access is used, all reduction field names and values must be completely uppercased to produce consistent results. Forgetting this step can lead to incorrect reduction of the data.
The Section Access security model looked similar to the one below, where the top table represents the Section Access table, and the bottom table represents the Section Application table, both of which are necessary for providing access and filtering the data model.

This security model allows users WMP\Employee3 and WMP\Administrator access to all values listed within the reduction field (Employee1 and Employee2 data), hence the '*' denoting 'All'. This indicates that users WMP\Employee3 and WMP\Administrator are both Supervisors in terms of access permissions. Employee1 and Employee2 have access only to their own data, since they have a specific value listed in the EmployeeID column denoting their respective EmployeeIDs.
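In load-script terms, a Section Access model like the one described can be sketched with an inline load. A minimal, hypothetical example (the EMP1/EMP2 values stand in for real EmployeeIDs, which the article does not list):

```
Section Access;
LOAD * INLINE [
    ACCESS, NTNAME,            EMPLOYEEID
    ADMIN,  WMP\ADMINISTRATOR, *
    USER,   WMP\EMPLOYEE3,     *
    USER,   WMP\EMPLOYEE1,     EMP1
    USER,   WMP\EMPLOYEE2,     EMP2
];
Section Application;
// The EMPLOYEEID column above associates with the EmployeeID field in the
// data model. Note that '*' grants access only to the values listed in this
// table, and that field names and values must be uppercase.
```

On open, QlikView reduces the data model to the rows associated with the reduction-field values granted to the logged-in user.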

While we needed to limit agent users to their own detailed data, we also needed to allow them to compare their performance against their team as a whole. This required us to create aggregated tables in our data model at the agent- and team-level that were not associated to each other. The data model must be split at these levels to ensure that users do not unintentionally filter the data in a way that distorts the comparative metrics.

Currently, our EmployeeName and TeamName columns are in the same table, thus relating them. Even if they were split into two tables but associated on a unique identifier, e.g. TeamID, they would still be implicitly related. With this structure, we would not be able to accurately give Employee2 the total calls for his or her Team1 on the date 7/11/2016. Even setting aside the fact that Employee1's records would correctly be removed from the data model with the Section Access settings in place, the application would still not display the correct call total of 50 for Agent2's team on 7/11/2016, because Agent2 simply did not have data for that day even though other members of his team did.

Avoiding this issue requires a two-step mitigation. First, the data model needs to be split so that the team metrics and the agent metrics are not associated with each other. We used the configuration below, where the team metrics table on the bottom is an aggregated version of the employee metrics table on the top, only without employee-specific information.
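A load-script sketch of that split, with every field in the team table renamed so the two tables share no field names and therefore stay unassociated (the table, file, and field names here are assumptions for illustration, not the article's actual model):

```
// Detail table: one row per agent per day
EmployeeMetrics:
LOAD EmployeeID,
     EmployeeName,
     TeamName,
     Date,
     Calls
FROM AgentCalls.qvd (qvd);

// Team-level table: aggregated from the detail rows at load time, before any
// Section Access reduction, so an agent still sees full team totals
TeamMetrics:
LOAD TeamName   AS Team,
     Date       AS TeamDate,
     Sum(Calls) AS TeamCalls
RESIDENT EmployeeMetrics
GROUP BY TeamName, Date;
```

Because TeamMetrics carries no EmployeeID field, Section Access reduction leaves it intact even after the agent's view of EmployeeMetrics has been reduced.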

Enroll here for a free demo class! Mindmajix
Education / Microstrategy: Longevity And Re-invention by sarahjohns388: 5:36am On Mar 24, 2018
Last month, MicroStrategy hosted its 20th user conference in Washington, DC, with over 2,800 attendees. There are only a couple of BI and analytics vendors that can lay claim to such longevity – most have been acquired or are new to the space.

It’s a different landscape than 20 years ago, when I first discovered DSS Suite at a conference in Germany. It’s a different landscape even to five years ago, when IT drove most of the buying and the themes of enterprise grade, standardization, and single-version of the truth were more the mantras. Now, self-service, agility, and ease of use often top BI buying priorities— priorities and budgets increasingly controlled by the business. These changes are forcing MicroStrategy to re-invent itself.

CEO and founder Michael Saylor kicked off the keynote by announcing that MicroStrategy is now available on Amazon Web Services. The vendor was early to the cloud, initially running in its own data centers and offering trial versions leveraging Amazon. This new capability gives greater flexibility for deployment and elasticity, allowing customers to deploy in less than an hour. I also noticed that Saylor sported a conservative suit, in contrast to prior years in attire more apt for a party. It seemed a subtle message, a shift in tone that aligned with Saylor's emphasis on MicroStrategy's financials: the company has been profitable on an annualized basis for at least a decade, with its last quarterly losses in 2014, and margins have greatly improved over the last two years (neither Tableau nor Qlik was profitable over the last two years on a GAAP basis). Note: we consider such things in the Magic Quadrant under Vendor Viability, and you can use the interactive version to see the impact of this on dot placement.

However, this is where any public company has two different stakeholders: shareholders who want growth, and customers who want great products, innovation, and best-in-class support. Maybe both stakeholders' interests are aligned over the long term, but the time horizons are often drastically different. The return to profitability has improved MicroStrategy's financials, but this, in part, has disrupted support and operations, as we wrote about in the MQ. At the conference, the vendor shared a number of initiatives to improve these things, including a revamped community site.

MicroStrategy was also early to develop Visual Insight in response to the rise of Tableau. But so far, it has failed to crack the ease-of-use requirement, and initially the agility part, although agility was largely addressed in version 10 as described in the Critical Capabilities note (or this older note, a deep dive on MicroStrategy 10). Ease of use was accordingly a key part of CTO Tim Lang's futures keynote on day two. The company introduced a new concept, Dossier, to allow users to rapidly assemble content into briefing books, with familiar navigation concepts like a table of contents. There is a lot here that looks promising, such as natural-language query and telemetry to recommend popular content. On this point, I was also impressed with customer Domtar and Alisha Witty's presentation on how they are using data to track usage and better design dashboards.
Programming / Certification For Bluecat Fundamental Training by sarahjohns388: 6:37am On Mar 02, 2018
About BlueCat Fundamentals Training:

The Mindmajix BlueCat Fundamentals training prepares students to manage IP address space, DHCP, and DNS through Address Manager. The training guides students through Address Manager and shows them how to optimize their IP space. The entire course is comprehensive, with real-world scenarios delivering practical value to aspiring professionals.

Key Features

30 hours of instructor-led training
Lifetime access to recorded sessions
Practical approach with real-world use cases and scenarios
24/7 support
Expert and certified trainers

Contact Info:

USA : +1 201 378 0518, +1 972-427-3027
email: info@mindmajix.com
Website: https://mindmajix.com/



Nairaland - Copyright © 2005 - 2024 Oluwaseun Osewa. All rights reserved.