Tips for Planning Your Next Student Trip

Overview

Source: Freepik

In this post, I will be discussing an important part of life: travel. I will show why it is urgent to talk about this topic and why travel matters for any individual's growth and flourishing. Then we will see why it is most relevant today to students, as students tend to be the most curious and the most open to learning from travel experiences. I will point out some of the challenges and obstacles most students come across whenever they start dreaming of crossing their bubble's borders. After that, I will throw out some quick tips for anyone who needs a thread in the dark to start from. Finally, I will close with some final thoughts.

 

As the school year goes on, students across the world are imagining, planning, balancing budgets, and joining other students in traveling around the world. With all the preparations that go into any trip, be it a weekend visit to a local town or an around-the-world summer journey, planning a trip might not be your greatest expertise or part of your past experience. Well, first things first: yes, it's not that simple! However, that doesn't mean it's too complicated and demanding either. I believe a successful trip takes only two things: patient, detailed planning on one side, and flexible, reflective execution on the other.

 

Why talk about travel?

As simple as this question might sound, the answer follows in a similar fashion. There is simply no substitute for getting to know the world, with its diverse nations and different places, first-hand. Travel is a life-changing experience for people's personalities; their political, economic, social, religious, and moral beliefs; their global awareness; and their human-to-human empathy. It is quite important for people to step out of their comfort zone and experience new things outside their usual context and surroundings.

 

For a deeper discussion on the importance of travel, check this article.

 

Why is it important for students to travel?

Whether it’s a semester abroad with a university, an international internship, voluntourism program, or independent travel, timing is key. Many students have way more control of their education and life experiences than they imagine. The power to decide how and where students want to learn is increasing dramatically with the increased understanding of the importance autonomous planning brings to people’s set of life skills. The difference between people who dream about traveling and people who actually do it is simply the decision.

 

Students should look out for experiential learning opportunities. The most immersive experiential learning occurs when students are placed in completely foreign environments and are put under a certain kind of pressure or exposed to a new culture and lifestyle. One of the best things about travel is, undoubtedly, the friendships and connections one can make with individuals, local communities, and people from across the globe.

Challenges

Travel can be significantly harder than anticipated, but the learning curve is usually fast. Below are some common challenges faced by student travelers in different contexts.

 

Having little or no access to one's usual support system is one of the most common challenges. Living abroad is all well and good when everything goes according to plan, but in those moments when the rug is pulled from under you, you can really feel on your own. In most cases, people don't realize how important their support system was until they leave its sphere of influence. This challenge can, however, be turned into an adventurous mission of exploration and discovery; building up a new support system takes patience and care, too.

Wherever one is from, and whatever new context one is going to, it is very common to feel like an outsider at times: sometimes alone while reflecting back on the day, and sometimes in the middle of a local store when one of the customers cracks a joke that passes right over you, while everyone else bursts out laughing and stares at you, and you don't even realize a joke was told.

 

Tips

Make some Local Friends.

Making friends along the journey is one of the most enjoyable aspects of traveling. Some of those friends you will interact with for only brief moments; others might accompany you along the journey, especially when you travel with a group. Most certainly, many will stay in your memory forever and might be part of your future life in one way or another. Mixing with the locals has serious advantages when it comes to learning more about the place and its culture, and picking up language skills.

 

Budget out your whole travel period.

Wrapping your head around the finances, the gives and takes, of your whole trip is fundamental to a well-paced trip that does not exhaust the available resources before the stay ends or resupplying is possible. Emergency money should always be thought about and kept in the plan.

 

Choose your Destination wisely.

This simply means that the student should have some desire for, or curiosity about, the targeted location, as that is key to a robust learning curve. The route to the destination is equally important, however, in terms of the choice of transportation, paths, and services.

 

Create and organize the Itinerary.

This might be a sketch or day-by-day plan of the whole trip, to and from the destination, drawn up before anything is booked. Research the sites and cities you might be interested in exploring patiently, and then decide what stays in the itinerary and what gets dropped.

 

For more detailed tips, you can check this website.

 

Closing Thoughts

There are few times in a person's life when they will have so few commitments: no rent, no partner, no kids. Students need to make the most of every travel opportunity that passes within their sight, to advance in a desired field, expand their network of connections, and learn about other peoples' cultures and history.

 

In conclusion, we have discussed the importance of travel for any individual, and especially its importance and impact on students. We then previewed some of the common obstacles that tend to cross student travelers' paths, followed by some brief tips that, if taken into consideration, would ease a student's trip. Finally, we closed with some thoughts on the preciousness of one of people's brightest stages: their college days.

 

References

9 Reasons why Traveling Is Important in Life

https://www.theodysseyonline.com/top-reasons-traveling-important-life

 

7 Budget Travel Tips for Student Travelers https://theblondeabroad.com/7-budget-travel-tips-for-student-travelers/

 

The Power of Travel for Student Success

https://www.edweek.org/tm/articles/2018/01/17/the-power-of-travel-for-student-success.html

 

Travel Tips for Students, From a Student

https://www.nytimes.com/2012/11/25/travel/travel-tips-for-students-from-a-student.html

Defining and Exploring Deep Learning

Overview

Source: ZDNet

In this post, we will be discussing the definition of Deep Learning and the general 'neural' way in which it operates. We will see why it is quite relevant today, and the growing interest in its theory, capabilities, and possible future uses. Then we will preview a number of applications in which Deep Learning is currently being used or developed. We will discuss the great potential and certain limitations this topic entails, and what the future of deep learning might look like. Finally, we will close with some final thoughts on Deep Learning and its possible outcomes.

 

Definition

Deep Learning is a relatively new area of Machine Learning. It was introduced and developed with the objective of moving Machine Learning closer to one of its original goals: Artificial Intelligence.

 

It loosely models itself on the complex, non-linear way the human brain categorizes and analyzes information and makes predictions based on learned data. Unlike traditional Machine Learning, Deep Learning attempts to simulate the way the human brain works, learns, and processes information by creating artificial structures, called "neural networks", that can extract complicated concepts and relationships from data, then build and extrapolate on them. Deep learning models improve through complex pattern recognition in pictures, text, sound, and other data to produce more accurate insights and predictions.
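As a loose illustration of that layered, non-linear structure, here is a toy forward pass through a two-layer network. The sizes and random weights are arbitrary assumptions for the sketch; real frameworks also handle training, which is omitted here:

```python
import numpy as np

# Toy two-layer "neural network" forward pass: each layer applies a
# linear transformation followed by a non-linear activation, which is
# what lets the model capture non-linear relationships in the data.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def forward(x, w1, b1, w2, b2):
    hidden = relu(x @ w1 + b1)   # first layer extracts simple features
    return hidden @ w2 + b2      # second layer combines them into a prediction

x  = rng.normal(size=(4, 3))     # 4 samples, 3 input features each
w1 = rng.normal(size=(3, 5)); b1 = np.zeros(5)
w2 = rng.normal(size=(5, 1)); b2 = np.zeros(1)

print(forward(x, w1, b1, w2, b2).shape)  # (4, 1): one prediction per sample
```

Stacking many such layers, with weights learned from data rather than drawn at random, is what makes the network "deep".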

 

Some good examples of Deep Learning frameworks are Apache MXNet, Microsoft Cognitive Toolkit, and Torch. Other newly emerging ones are eightfold.ai and missinglink.ai.

 

Why is it Relevant Today?

The fast-growing importance of AI and Deep Learning can be seen, for example, by observing the investment value in the AI market. Venture Capital investment in AI doubled in 2017, attracting $12B compared to $6B in 2016, and is expected to reach $58B by 2021. Companies are increasingly turning to Deep Learning because it allows computers to learn independently and undertake tasks with little or no supervision, promising extraordinary benefits for science, industry, communication, and security.

 

For a good reading on its relevance and importance, check this article.

 

Use Cases Examples

Deep Learning applications are all around us; I'll preview a few of them here.

 

Automated Driving: Companies building self-driving cars, like Tesla and Google, are essentially teaching a computer how to take over key parts, or all, of the driving process, relying on its experience interpreting and reacting to the digital sensor system attached to the car. Researchers are using deep learning to automatically detect and appropriately interpret objects such as stop signs, traffic lights, and pedestrians.

 

Health Care: From early detection of breast or skin cancer to creating personalized medicine on the basis of a biobank's data, deep learning will reshape the way our health systems function. Innovations in AI are advancing the future of precision medicine and population health management in extraordinary ways. Computer-aided detection, quantitative imaging, decision-making support tools, and computer-aided diagnosis are some of the big focuses in the field.

 

Aerospace and Agriculture: Finding optimal routes and precise maneuvers for spacecraft and satellites, optically analyzing forested or planted land for early detection of diseases and fires, locating icebergs in the paths of ships by analyzing satellite images, and identifying, locating, and analyzing objects of interest across massive databases of satellite imagery are some of the applications currently being researched.

 

Electronics: Deep learning is already widely used in automated hearing, speech and text recognition, translation, and computer vision. Alexa, Siri, Google Now, and Cortana are a few examples of the daily encounters people have with deep learning applications.

 

Future?

The prospects are clearly intriguing, as companies like Google, Microsoft, and Facebook are spending millions upon millions on research into advanced neural networks and deep machine learning. This comes with numerous promising projects and applications that would reshape the way humans interact with AI on a daily basis.

 

In the meantime, significant hardware and algorithmic developments have been underway, building up to what appears to be a race of new applications for deep learning frameworks in areas as diverse as energy, medicine, physics, and beyond.

Some bright current research projects are working on spotting invasive brain cancer cells during surgery (critically reducing the error margin), restoring the colors of black-and-white pictures and films, pixel restoration to produce near-original versions of low-resolution faces, and many more.

 

Closing Thoughts

We have seen what Deep Learning is, what some of the valuable technologies driven by these 'neural networks' can do, and what great powers some of them certainly have. But that does not mean we understand much about the exact way these deep, multi-layered systems work, or whether they can be made to converge to a solution within finite time.

Humans understand the tremendous potential in embracing AI that could become smart and powerful enough to be part of our lives. However, people must be aware of what is wanted from these smart machines, as they do not solely serve our interests; at least, we have no guarantee of that so far.

 

In conclusion, we have gone through the definition of Deep Learning, the way it works, and its growing importance and popularity among investors, tech companies, and new startups. We then went through a thread of various applications and possible use cases, concluding with some final thoughts on certain limitations in terms of ambiguity and complexity.

References

Deep Learning

http://deeplearning.net/

 

What is Deep Learning? 3 things you need to know.

https://www.mathworks.com/discovery/deep-learning.html

What is deep learning? Everything you need to know

https://www.zdnet.com/article/what-is-deep-learning-everything-you-need-to-know/

 

25 Machine Learning Startups to Watch 2018

https://www.forbes.com/sites/louiscolumbus/2018/08/26/25-machine-learning-startups-to-watch-in-2018/#7c202c266f99

 

30 Amazing Applications of Deep Learning

http://www.yaronhadad.com/deep-learning-most-amazing-applications/

What is Software Composition Analysis and Why is it Important?


 

Source: Beta News

Overview

In this post, we will be discussing the definition, relevance, and growing importance of Software Composition Analysis. Then we will focus further on the three generations it took to reach its current state. We will then view and explain the Maturity Model and its importance in helping software organizations assess their level of security. Finally, we will close with some final thoughts on SCA and its future.

Definition

Software Composition Analysis (SCA) solutions, also called OSS Security Scanning solutions and services, provide OSS audits by analyzing the source code and files constituting an application, in order to provide the company with a complete inventory of the commercial, proprietary, and open source components, including all direct and transitive dependencies, used in that particular application. In practice, it acts mainly as an open source management tool, because its real purpose is dealing with open source code.

 

Why is it Relevant?

SCA is definitely relevant to any enterprise that uses open source applications, as many studies in recent years have shown the growing vulnerability that open source components introduce to companies' security and data. In fact, open source components have become the main building block in software applications across the board. At the same time, companies have provided many examples of security holes in their systems, such as leaks of their own or their customers' data, because they were not handling the security of their applications properly; they need a solution for this critical issue. SCA comes in to help the company identify and assess the risks that might later arise from its open source libraries, covering both security and license risks, and reducing the work and expense of future security defects.

 

Moreover, the fact that companies understand the real need for such a system can be seen by looking at the market, which is expected to grow even further in the coming years. The software composition analysis market is expected to grow from around USD 150 million in 2017 to about USD 400 million by 2022, at a CAGR of 20.9% during the forecast period.
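As a quick sanity check, the quoted growth rate is consistent with those endpoints:

```python
# Compound the 2017 base at the quoted CAGR over the five years to 2022.
start_musd, cagr, years = 150, 0.209, 5
projected = start_musd * (1 + cagr) ** years
print(round(projected))  # ≈ 387 million USD, in line with the quoted ~400
```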

 

For more information on the relevance and the future of SCA, check this blog post.

 

Software Composition Analysis

SCA tools can be seen as having gone through three generations, marked by differences in their technological advances.

 

1st Generation: Open Source Code Scanning

Since the early days of the spread of open source applications, open source code scanning has given companies the ability to oversee their products and the security issues in their open source inventory. This was done by analyzing parts of their code and comparing them against existing open source databases. However, this proved not very practical, as many false positives were identified: proprietary and commercial bits of code were falsely recognized as open source, requiring other tools and services to recheck the scanning results and weed out those false positives.
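A hypothetical sketch of that first-generation approach: fingerprint pieces of source code and look them up in a database of known open source snippets. The database, chunking, and matched component here are all made up for illustration; real scanners use far more robust matching, and line-level matching like this is exactly why common code fragments produce false positives.

```python
import hashlib

# Illustrative database mapping hashes of known open source lines to
# the component they came from (entirely fictional data).
KNOWN_OSS_HASHES = {
    hashlib.sha256(b"def quicksort(arr):").hexdigest(): "example-lib 1.0 (MIT)",
}

def scan(source_lines):
    """Return (line_number, matched_component) pairs for suspected OSS code."""
    hits = []
    for i, line in enumerate(source_lines, start=1):
        digest = hashlib.sha256(line.strip().encode()).hexdigest()
        if digest in KNOWN_OSS_HASHES:
            hits.append((i, KNOWN_OSS_HASHES[digest]))
    return hits

code = ["x = 1", "def quicksort(arr):", "    pass"]
print(scan(code))  # [(2, 'example-lib 1.0 (MIT)')]
```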

 

The drawbacks of this system are various, however, and companies realized them soon after adopting it: the scanning time is lengthy, it does not integrate continuously with the software development lifecycle (SDLC), and the rate of false positives in its results is high. This made it an ineffective tool for solving the vulnerability issue in the software production world.

2nd generation: Continuous Open Source Components Management

Later on, a new technology was created to match the demands of modern agile production standards. Continuous management of open source components differs from its predecessor in that it integrates with various software development tools, such as repositories, build tools, package managers, and CI servers. It detects open source components, along with their vulnerabilities and licensing issues, in real time.

This shift to real-time detection of vulnerabilities and licensing issues has greatly benefited security specialists by allowing them to catch problematic bits of code early on, making issues easier and simpler to solve.

3rd generation: Effective Usage Analysis

This newborn technology can supply companies with much more than simply identifying the components present in an application: it investigates on a deeper level how each component is actually used, notes its effect on the application's security, and suggests actions to be taken.

 

For more information on the three different generations, check this blog post.

 

Maturity Model

The maturity model supports legal, security, and development specialists by detecting existing gaps and directing future investment. It provides a place to start from: a benchmark for seeing where different companies stand in terms of process maturity and business value, and a description of what improvement actually means for the company.

 

Security and license compliance maturity in a company is measured in relation to these four dimensions:

Vulnerability management: to avoid security defects arising from using third-party components.

License management: to manage open source license dependencies and minimize legal risks effect.

Obligation management: to manage obligations related to the use of open source software, based on associated licenses and company regulations.

Component management: to provide a view on what components are used, and integrate this view in usage and product roadmap decisions.

 

Closing Thoughts

We have seen that SCA is quite essential for any company that relies on open source code, which means the majority of today's companies. This, however, does not necessarily mean the issue is solved, because the purpose of the tool is to help identify issues in the code rather than fix them. The latter must be addressed by the company itself, as it is its responsibility to uphold and maintain the security of its customers' data against malicious bits of code. We will likely see new upgrades to these tools in the future to keep up with the speed at which the open source software universe is growing, and companies will need to be ever more careful with their apps.

In conclusion, we have gone through the definition of Software Composition Analysis and its growing importance and popularity among software companies. We then dived deeper into the various generations of SCA and the different features they offer in terms of security and practicality, concluding with some final thoughts on its future.

 

References

Software Composition Analysis and OSS Security Scanning featured on Gartner’s 2017 Top Technologies for Security

http://blog.klocwork.com/open-source/software-composition-analysis-and-oss-security-scanning-featured-on-gartners-2017-top-technologies-for-security/

 

Global Software Composition Analysis Market 2018-2022-High Adoption in the Fintech Sector

https://www.prnewswire.com/news-releases/global-software-composition-analysis-market-2018-2022—high-adoption-in-the-fintech-sector-300627191.html

 

Introducing the Software Composition Analysis Maturity Model

https://dzone.com/articles/introducing-the-software-composition-analysis-matu

 

Software Composition Analysis: Identify Risk in Open Source Components

https://www.whitehatsec.com/blog/software-composition-analysis/

Defining HTML5 & HTML5 Video


Source: W3C.

Overview

In this post, we will be discussing the definition, usage, and popularity of HTML5 across the Web. Then we will focus further on HTML5 Video: what it is, how it is used, and why it is better to use an HTML5 player instead of Adobe Flash. Finally, we will view and explain different types of HTML5 video players, what they do, and what special features they have.

 

Definition

HTML stands for HyperText Markup Language, a markup language used for structuring and presenting content of various types on the Web. HTML5 is the latest version of the HTML standard, published in 2014 by the W3C with many new features added.

 

Usage & Popularity

HTML5 can be seen as the successor for Web content delivery, especially after Steve Jobs declared in his 2010 essay "Thoughts on Flash" that "Flash is no longer necessary to watch video or consume any kind of web content" and that "new open standards created in the mobile era, such as HTML5, will win". This comes with the greatly enhanced functionality that HTML5 provides. Moreover, in July 2017 Adobe announced that Flash's era would be over by the end of 2020, leaving the arena smoothly to other open formats, mainly HTML5. HTML5 standards have been implemented across all modern web browsers, and the need for Flash just isn't there anymore.

 

HTML5 Video

One of the major new elements introduced by the HTML5 specification is the video element, which replaces the object element. Its creators set it on a path to take over video content on the Web, replacing the need for Flash and its browser plugin. HTML5 does not specify which video and audio formats its users should use, which means it is up to them to decide which formats to upload and support. However, the general guidelines advise that user agents should support Theora video, Vorbis audio, and the Ogg container format, as they are generally widespread.
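A minimal example of the video element in action; the file names are placeholders. The browser plays the first source whose format it supports, and the inner text is shown only by browsers that do not understand the element:

```html
<video controls width="640">
  <source src="lecture.webm" type="video/webm">
  <source src="lecture.mp4" type="video/mp4">
  Your browser does not support the HTML5 video element.
</video>
```

Listing several source elements like this is how a page copes with HTML5 leaving the choice of formats open.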

 

Using HTML5 instead of Flash is a process that started years ago with the big companies, led by the big search engines and social networks such as Google, Facebook, Wikipedia, and YouTube. Currently, about 72% of all websites on the internet support HTML5, which shows the trend toward HTML5, especially when compared with Flash's 5%!

 

For more info on how HTML5 and Flash compare, check out this article.

 

On the other hand, it is worth remembering that HTML5 on its own cannot be used for animation or interactivity; it needs to be supplemented with CSS3 or JavaScript.

 

HTML5 Video players

The following list of five video players spans the general needs of different users, ranging from fresh YouTube creators to successful web designers.

 

Elite Video Player

This simple, fully customizable, and highly accessible player can handle self-hosted (mp4 only), Vimeo, YouTube, Amazon S3, and Google Drive videos. It works across all browsers and includes features such as lightbox, responsive, and full-screen modes, plus YouTube channel and playlist support. More importantly, it brings the ability to add pre-roll, mid-roll, and post-roll video and popup ads.

 

VideoJS

A free and open source player that can fall back to Flash and other alternate playback technologies, such as Microsoft's Silverlight. Its features include basic playback functions such as autoplay and reload, as well as skinning (it is built from HTML and CSS, and the user can switch to a different skin) and plugins, which can be used to enhance the performance and abilities of the player; some of the best plugins are Analytics, Playlist, and Brand.

 

Cloudinary

This player is bundled with numerous features and functionalities. It provides an API for fast upload and a secure path for content privacy, and it offers broad device support for both mobile and desktop web browsers. Moreover, it provides video manipulations that can be applied at both the player and the per-video level. It supports the mp4, mov, webm, and ogv formats, and it can be configured to convert a video to the best format for the user's browser. It is analytics-ready and highly customizable. For more info, check out Cloudinary's site.

 

JWPlayer

Having been among the top five players, together with VideoJS, for a long time, it boasts a large number of features, from analytics to full HTML5 video controls. This player is fully themeable, has an integrated API, and supports 4K resolution and 360-degree video with gyroscope motion support. It also offers a sizable number of features through its add-ons, such as advertising tie-ins, closed captioning, and social networking tools.

 

Plyr

This minimalist player is quite popular among different users due to its lightweight design, which smooths the processing of large videos. It provides full support for screen readers and VTT captions. Moreover, it gives users high customization abilities over its appearance, and it also encloses a wide range of processing and editing elements.

 

For more players and details on them, check out this blog post.

 

Closing Thoughts

Adding a video to your webpage is not as hard as it used to be with Flash and Silverlight; as we have seen, HTML5 provides a new, easier way that does not require a plugin built into the browser.

 

We have seen that there are many options for an HTML5 video player, but which one to use is your choice. Different clients have different needs and desires they wish to have met, and it is up to them to decide which player fits their workflow best. But I hope my brief explanation can guide you through some of the widely used platforms.

 

In conclusion, we have gone through the definition of HTML5, its usage, and its growing popularity, especially when compared with Flash. We then dived deeper into HTML5 video players and previewed some of the popular ones, with the different and interesting features each has to offer, concluding with some final thoughts.

 

References

HTML5: What is it?

https://www.techradar.com/news/internet/web/html5-what-is-it-1047393

 

HTML5 Video Player

https://cloudinary.com/visualweb/display/IMMC/HTML5+Video+Player

 

15 Best HTML5 Video Players

https://code.tutsplus.com/tutorials/15-best-html5-video-players–cms-28589

 

The Advantages Of Using An HTML5 Video Player

http://blog.viewbix.com/the-advantages-of-using-an-html5-video-player/

 

Is Your HTML5 Video Player Fully Compatible?

https://www.dacast.com/blog/html5-video-player/

 

What are ETL Tools? A Definition and Comparison

ETL Tools
(Source: Windsor)


Overview

In this post, I will review the ETL process and cover some major ETL tools, including various use cases and features. I will then discuss the relevance ETL tools have in today's ecosystem.

 

To get started with ETL, see this ETL wiki, which includes a collection of resources to get you acquainted with basic ETL and data integration concepts.

 

Definitions

What is ETL?

ETL is a process in data warehousing responsible for pulling data out of source systems and placing it into a linked data warehouse. The process consists of three tasks: extracting the data, transforming the data, and then loading the data.

Data must be properly formatted and normalized in order to be loaded into these types of data storage systems, and ETL is used as shorthand to describe the three stages of preparing data. ETL also describes the commercial software category that automates the three processes.

 

ETL Process

In ETL, the following three tasks are performed:

Extract:

The extraction process is the first phase of ETL, in which the desired data is identified and extracted from one or more sources, including database systems and applications.

Transform:

After data is extracted, it is transported to the target system or to an intermediate system, where it is cleansed, normalized, and converted into the format the target requires.

Load:

The load phase moves the transformed data into the permanent target database. It loads data into data warehouses, data repositories, and other reporting applications as well.

Once loaded, the ETL process is complete, although many companies perform ETL regularly in order to keep the data warehouse up to date.
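The three steps above can be sketched end to end. The source CSV, column names, and "warehouse" table here are invented for illustration, with SQLite standing in for the target database:

```python
import csv
import io
import sqlite3

# Hypothetical raw source data: one row ("bob,oops") is malformed.
SOURCE_CSV = "name,amount\nalice,10\nbob,oops\ncarol,32\n"

def extract(text):
    """Extract: pull rows out of the source system (here, a CSV string)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize names, coerce amounts, drop invalid rows."""
    clean = []
    for row in rows:
        try:
            clean.append((row["name"].title(), int(row["amount"])))
        except ValueError:
            pass  # row fails validation and is excluded from the load
    return clean

def load(rows, conn):
    """Load: write the transformed rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount INT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
print(conn.execute("SELECT * FROM sales").fetchall())
# [('Alice', 10), ('Carol', 32)]
```

Running this pipeline on a schedule, against real source systems, is essentially what the commercial tools below automate, along with monitoring, recovery, and scale.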

For more info on the process and its individual steps, check my previous blog post on this.

 

 

What are the Tools?

Leading data integration vendors must manage small and big data, unstructured and structured data, batch and real-time streaming, and on-premises, cloud, and hybrid deployments, and must deliver trusted data in a self-service fashion to everyone from business analysts to citizen integrators, all built on a unified metadata management foundation. Here are five of the best ETL tools, with some of their features and use cases:

 

Informatica Powercenter

Informatica PowerCenter is an enterprise data integration platform that works as a unit, and it performs well in terms of features and capabilities, as well as the job opportunities and career growth it offers.

With its high availability as well as being fully scalable and high-performing, PowerCenter provides the foundation for all major data integration projects and initiatives throughout the enterprise.

Informatica PowerCenter enables access to almost any data source from one platform, because it uses technologies like Informatica PowerExchange and PowerCenter Options.

IBM InfoSphere

InfoSphere Information Server is a line of products from IBM responsible for data warehousing and data integration.

IBM introduced InfoSphere Information Server as a complete set of business intelligence and data warehousing products operating in four functional areas: source data profiling, data quality assurance, transformation, and delivery.

 

Microsoft SQL Server Integration Services

It is part of Microsoft's database product, Microsoft SQL Server. It has great features, like connection monitoring and the task-management component.

Integration Services is a great tool not only for companies maintaining data warehouses, but also for administrators of small databases. Its list of features is impressive, though there are some things that could be improved.

 

Amazon Web Services ETL

AWS is a cloud-based computing service offering from Amazon. AWS offers over 90 services and products on its platform, from storage to game development. As part of these services, Amazon offers ETL tools: AWS Glue is a managed ETL service, AWS Data Pipeline is an automated ETL service, and AWS Elastic MapReduce (EMR) and Amazon Athena/Redshift Spectrum are data offerings that assist in the ETL process.

 

BusinessObjects Data Services

It is a comprehensive platform for the data integration process and constitutes the latest version of the BusinessObjects application. In comparison with previous versions of the BO ETL tool (BusinessObjects Data Integrator), a Data Quality module is now an integral part of Data Services. The SAP BusinessObjects Data Services platform has a modular structure and consists of a Data Services Designer, which offers a number of pre-defined transformations and functional objects for modeling ETL flows, a Management Console, and a Central Repository.

 

For a bigger list, check this webpage.

 

What is the relevance of ETL Tools today?

When creating a data warehouse, it is common for data from disparate sources to be brought together in one place so that it can be analyzed for patterns and insights. It would be great if data from all these sources had a compatible schema from the outset, but this is rare. ETL takes heterogeneous data and makes it homogeneous; without ETL it would be impossible to analyze heterogeneous data in an automated way and derive business intelligence from it.
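To make the heterogeneous-to-homogeneous idea concrete, here is a tiny hedged sketch: two hypothetical source systems store signup dates in incompatible formats, and the transform step coerces both into one ISO schema (system names and fields are invented for illustration):

```python
from datetime import datetime

# Hypothetical records from two source systems with incompatible date formats.
crm_rows = [{"customer": "Acme", "signup": "03/15/2024"}]   # US-style dates
erp_rows = [{"customer": "Acme", "signup": "2024-03-15"}]   # ISO-style dates

def normalize(rows, fmt):
    # Transform step: coerce every source's dates into one ISO format.
    return [
        {**r, "signup": datetime.strptime(r["signup"], fmt).date().isoformat()}
        for r in rows
    ]

# After transformation, both sources share one schema and can be analyzed together.
unified = normalize(crm_rows, "%m/%d/%Y") + normalize(erp_rows, "%Y-%m-%d")
```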

 

Because of ETL tools, analysts, business users, and data scientists can interact with and query the data without disturbing business operations. More importantly, most ETL tools eliminated the need to code: they provide a drag, drop, click, and configure interface that breaks the transformation steps into distinct “stages”. ETL specialists can build step-wise routines that are easy to trace and debug without needing to understand code written by engineers.

 

Closing Thoughts

Building a data warehouse is a major undertaking that’s expected to yield substantial business benefits in order to justify the cost and effort. Using the right ETL Tool(s) is essential for success and for keeping up with the market and what it is demanding.

We have seen the definition of ETL, the process with its three stages, and some of the numerous tools available in the market, comparing their usage. After that, we saw different reasons that make ETL tools relevant for many users from different fields.

 

 

References

ETL Software tools

https://www.etltools.net/

ETL Process

http://datawarehouse4u.info/ETL-process.html

What is Extract, Transform, Load (ETL)?

https://www.informatica.com/services-and-training/glossary-of-terms/extract-transform-load-definition.html#fbid=RjHRT3238g7

Overview of ETL

https://docs.oracle.com/cd/B19306_01/server.102/b14223/ettover.htm

ETL Tools

http://datawarehouse4u.info/ETL-tools.html

Top 3 of the Best ETL tools in the Market in 2017


What is ETL Testing and How it Works

Source:
360logica


In this post we define ETL and discuss what it means exactly. We will then define ETL testing and discuss the process, how it works, and why it’s important. Moreover, we will cover the relevance of ETL today, common users and use cases, and pros and cons. Finally, we will provide some best practices to follow for optimal ETL testing.

 

What is ETL?

ETL is a process in data warehousing that is responsible for pulling data out of the source systems and placing it into a data warehouse. The process comprises three tasks: extracting the data, transforming the data, and loading the data. Data is extracted from an OLTP database, transformed to match the data warehouse schema, and loaded into the data warehouse database. Many data warehouses also incorporate data from non-OLTP systems such as text files, legacy systems, and spreadsheets.
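The three tasks can be sketched end-to-end in a few lines. This is an illustrative toy, not a real pipeline: a single in-memory SQLite connection stands in for both the OLTP source and the warehouse, and all table names are invented:

```python
import sqlite3

db = sqlite3.connect(":memory:")  # one connection stands in for both systems

# Source (OLTP) side: an illustrative orders table storing amounts in cents.
db.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER)")
db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1250), (2, 340)])

# Extract: pull rows out of the source system.
rows = db.execute("SELECT id, amount_cents FROM orders").fetchall()

# Transform: reshape to match the warehouse schema (dollars instead of cents).
warehouse_rows = [(order_id, cents / 100.0) for order_id, cents in rows]

# Load: place the transformed rows into the warehouse table.
db.execute("CREATE TABLE dw_orders (id INTEGER, amount_dollars REAL)")
db.executemany("INSERT INTO dw_orders VALUES (?, ?)", warehouse_rows)

total = db.execute("SELECT SUM(amount_dollars) FROM dw_orders").fetchone()[0]
```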

For more info on ETL, check my other blog post regarding this

ETL Testing

ETL testing is done to ensure that the data loaded from a source to the destination after business transformation is accurate. It also involves verifying data at the various intermediate stages between source and destination.

 

ETL testing is performed in five stages. The first is identifying the data sources and requirements for the system in use. The second is performing efficient and timely data acquisition, followed by implementing business logic and dimensional modeling. Then comes building and populating general and targeted data, and finally generating and analyzing reports that reflect performance.

 

ETL testing is about discovering problems in the data stream and correcting them. In the past, this was done in a waterfall approach by identifying problems in the data or the ETL process, building a system to resolve those problems, testing that everything works, and rolling into production. But it is expected that the future of this field will be more of an agile process in which data issues are fixed on the fly, and largely automatically, with no interruption to data ingestion.

 

The general methodology of ETL testing is to use SQL scripting, which can be time-consuming and error-prone and seldom provides complete test coverage. To accelerate testing, improve coverage, and reduce costs in production and development environments, ETL testing can be automated completely using various tools.
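As a hedged sketch of what such SQL-based checks look like, here is a minimal source-to-target reconciliation: row counts and aggregate totals on the source and target tables must agree (the `src`/`tgt` tables and their data are invented for the example):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE src (id INTEGER, amount REAL)")
db.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
db.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
db.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

def reconcile(db):
    # Classic ETL-test checks: row counts and aggregate sums must match
    # between the source extract and the loaded target.
    src_count, src_sum = db.execute("SELECT COUNT(*), SUM(amount) FROM src").fetchone()
    tgt_count, tgt_sum = db.execute("SELECT COUNT(*), SUM(amount) FROM tgt").fetchone()
    return src_count == tgt_count and src_sum == tgt_sum

ok = reconcile(db)
```

Automated testing tools essentially run batteries of checks like this one, plus column-level and row-level comparisons, on a schedule.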

 

Why is it relevant?

Data Warehouse testing assures that information is not just loaded correctly, but that it is appropriately aggregated, catalogued and verified so that it is useful and accurate for analysis and decision-making. It is worth noting that the ETL process is particularly vulnerable in this regard.

 

As an example of its necessity with regard to performance and scalability, consider a hypothetical client whose business succeeds, who opens new branches in other cities, and who has not tested the scalability of their data warehouse. As queries to the BI system increase, the system bogs down and response times slow. This will drive customers toward a competitor who can handle the increased business.

 

While it may seem obvious, it is also worth noting that testing should go hand-in-hand with any application changes or new releases. This is regression testing; in fact, automated regression testing is typically performed weekly or even daily, depending on user or system requirements.

 

For more info, check this detailed webpage.

 

Common Challenges

ETL testing is different from application testing because it operates on a data-centric testing approach. Several challenges commonly arise when dealing with ETL: ETL testing involves comparing large volumes of data, usually in the millions of records; the data to be tested sits in heterogeneous data sources (e.g. databases, flat files); data is often transformed, which might require complex SQL queries to compare; and ETL testing depends heavily on the availability of test data covering different test scenarios.

 

Best Practices

Many users can notably increase the efficiency of their tools simply by learning how to use them well. The first piece of advice is always to research, ask, question, and wander around; there are many resources with different features for different purposes.

 

Proficiency in ETL testing can generally be achieved by working on multiple points. The user should, for example, make sure that the data is transformed correctly, without data loss or truncation, and that the projected data is loaded into the data warehouse. The user should also ensure that the ETL application appropriately rejects invalid data, replaces it with default values, and reports any dysfunctions or errors. You also need to ensure that the data is loaded into the data warehouse within the prescribed and expected time frames, to confirm scalability and performance. All methods should have appropriate unit tests, which should use appropriate coverage techniques to measure their effectiveness. Finally, the user should create unit tests that target outliers, exceptions, and unusual cases that might signal a specific error or malfunction.
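A small hedged illustration of two of these points together, a reject-and-replace transformation rule and unit tests that deliberately target its outliers (the function, field, and default value are all invented for the example):

```python
DEFAULT_COUNTRY = "UNKNOWN"  # illustrative default replacement value

def clean_country(value):
    # Reject-and-replace rule: blank or missing values get a default,
    # everything else is trimmed and normalized to upper case.
    if value is None or not value.strip():
        return DEFAULT_COUNTRY
    return value.strip().upper()

# Unit tests that target outliers and unusual cases, not just the happy path.
assert clean_country("de") == "DE"
assert clean_country("  fr ") == "FR"          # stray whitespace
assert clean_country(None) == DEFAULT_COUNTRY  # missing value
assert clean_country("   ") == DEFAULT_COUNTRY # blank value
```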

 

Closing Thoughts

Building a data warehouse is a major undertaking that’s expected to yield substantial business benefits in order to justify the cost and effort. To ensure that your ETL data warehouse project lives up to expectations, ETL testing must be front and center, particularly in the all-important early stages of the project.

 

We have seen the definitions of ETL and ETL testing. After that, we saw different reasons that make it relevant for many users. Later on, we elaborated on some best practices and a checklist to follow when working with an ETL tool, finishing with those closing thoughts.

 

References

ETL Testing the Future is Here

https://www.alooma.com/blog/etl-testing-the-future-is-here

ETL Testing or Data Warehouse testing tutorial

https://www.guru99.com/utlimate-guide-etl-datawarehouse-testing.html

Basics of ETL Testing

http://www.datagaps.com/concepts/etl-testing

Why is ETL Testing so Important?

https://www.coherentsolutions.com/blog/why-is-etl-testing-so-important/

Bug Reporting, Error Handling, Error Monitoring – Definition and Best Practices


Overview

As code becomes ever more present in everyone’s life, one fact remains: at the end of the day it is, mostly at least, written by humans. This means that errors will occur and, in many cases, won’t be foreseen. Some errors will be detected quickly, and some will need intense processing to be spotted.

 

In this article, we will discuss and explain what bug reporting, error handling and error monitoring mean. We will also see some of their use cases and examples of each, before viewing the importance of tracking bugs and errors. Then, we will see the similarities and differences between these concepts and some of the best practices while dealing with them.

 

What do these terms mean?

Bug reporting

Bug reporting is an important aspect of software testing. An effective bug report communicates well with the development team and avoids confusion or miscommunication. Defect writing and reporting is one of the most important areas in the testing life cycle, and also one of the most neglected.

 

Error Handling

Error handling refers to the anticipation, detection, and resolution of programming, application, and communications errors. Specialized programs, called error handlers, are available for some applications. They should forestall errors if possible, recover from them when they occur without terminating the application, or, if all else fails, gracefully terminate an affected application and save the error information to a log file.
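A minimal sketch of this anticipate/recover/log pattern, with invented function names and a standard-library logger (a real error handler would be richer, but the shape is the same):

```python
import logging

logging.basicConfig(level=logging.ERROR)
log = logging.getLogger("app")

def safe_divide(a, b, default=0.0):
    # Anticipate the error, recover without terminating the application,
    # and save the error information to the log.
    try:
        return a / b
    except ZeroDivisionError:
        log.error("division by zero: a=%r b=%r; returning default %r", a, b, default)
        return default

result = safe_divide(10, 0)  # recovers gracefully instead of crashing
```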

 

Error Monitoring

When errors happen in a production environment, they can be detrimental to application performance. Errors can be exceptions not handled by the code in the context of a business transaction, caught exceptions in database calls, or HTTP 404 errors.

Errors are tracked with data about how many occurred in a period of time, the frequency with which they occur, and the error rate relative to the total number of transactions.
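Those three metrics can be computed from a plain event log. A hedged toy example, with an invented log of `(minute, http_status)` pairs standing in for real production telemetry:

```python
from collections import Counter

# Hypothetical request log: (minute, status) pairs from a production app.
events = [
    (0, 200), (0, 200), (0, 500),
    (1, 200), (1, 404), (1, 404), (1, 200),
]

# Errors per time window (frequency) and overall error rate.
errors_per_minute = Counter(minute for minute, status in events if status >= 400)
error_rate = sum(errors_per_minute.values()) / len(events)
```

A monitoring tool does essentially this continuously, then alerts when the rate or frequency crosses a threshold.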

 

Use Cases & Examples

Bug Reporting can be done by any individual using any sort of platform, and it would always help the developer, when used properly.

This page explains in detail the methods to follow when writing an effective Bug Report.

Error Handling is a very essential stage in every program’s lifecycle, it is handled more on the developers’ level than the direct user, and is managed by powerful software.

Error monitoring is also a production-level stage, but it can, and should, extend to the execution level, where the user is interacting with the code. It can be done using various methods, including effective tools such as Bugsnag or Rollbar.

 

Why is it important to track bugs and errors, and what can go wrong if you don’t?

As long as humans write code, there will naturally be errors. That is not the problem; it is a natural process that every program or platform goes through as it develops and advances. Problems occur when those errors go unreported and unnoticed. Bugs and errors need to be identified, tracked, monitored, analyzed, and fixed, and the more accurate and useful data a company has, the more control it has over the process of improving its product.

 

What are the similarities and differences between them?

An error is a mistake, misconception, or misunderstanding on the part of a software developer. In the category of developer we include software engineers, programmers, analysts, and testers. For example, a developer may misunderstand a design notation, or a programmer might type a variable name incorrectly, which leads to an error. Errors are generated by wrong logic, loops, or syntax; they normally arise in software and change the functionality of the program.

 

On the other hand, a bug is the result of a coding error: an error found in the development environment before the product is shipped to the customer. It is a programming error that causes a program to work poorly, produce incorrect results, or crash; an error in software or hardware that causes a program to malfunction. “Bug” is the tester’s terminology.

 

For more info, check this informative page.

Best practices for handling bugs and errors

A good bug report should be clear and concise without missing key points; any lack of clarity leads to misunderstanding and slows down the development process. Reports should include all, or as many as possible, of the following:

Bug ID, bug title, priority, platform, description, steps to reproduce, expected result (and actual result, if known), and a screenshot (if the bug has a visual effect).
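That checklist maps naturally onto a simple data structure. A hedged sketch, where the field names mirror the list above and the `is_actionable` rule is an invented example of a validation a tracker might enforce:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BugReport:
    # Fields mirror the checklist above; names are illustrative.
    bug_id: str
    title: str
    priority: str
    platform: str
    description: str
    steps_to_reproduce: list
    expected_result: str
    actual_result: Optional[str] = None
    screenshot_path: Optional[str] = None

    def is_actionable(self) -> bool:
        # A report a developer can act on needs a description and repro steps.
        return bool(self.description) and bool(self.steps_to_reproduce)

report = BugReport(
    bug_id="BUG-101", title="Login button unresponsive", priority="High",
    platform="Android 14", description="Tapping Login does nothing.",
    steps_to_reproduce=["Open app", "Tap Login"],
    expected_result="Login form opens",
)
```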

Errors, on the other hand, should be dealt with at the same level of professionalism. They need to be supervised and written in near real time to a log that helps the developer identify patterns and trends, and this can only be achieved with a powerful tool that tracks errors in the greatest detail, supplying the environment with valuable data that puts the coder ahead of the game.

 

Closing Thoughts

We have seen the definitions of bug reporting, error handling, and error monitoring. After that, we saw use cases and examples for each of the three. Later on, I explained why it is important to perform these tasks, and the complications that can occur if not enough attention is paid.

Furthermore, we have seen the differences and similarities between errors and bugs and how they differ by definition. We then went over the importance of following appropriate practices while dealing with bugs and errors, finishing with these brief closing thoughts.

 

References

How to Write a Good Bug Report? Tips and Tricks


What is Error Handling

http://searchsoftwarequality.techtarget.com/definition/error-handling

Difference between Defect, Error, Bug, Failure and Fault!


Error Monitoring

https://www.appdynamics.com/opscentral/error-monitoring/

What is AWS EC2?


What is AWS EC2?

Amazon Elastic Compute Cloud (Amazon EC2) is an Amazon platform that provides secure, resizable compute capacity in the cloud. It claims to make web-scale cloud computing easier and more accessible for developers. Its interface is designed so the user can obtain and configure capacity with minimal friction (see the official documentation).

 

This post will be presenting the definition of AWS EC2 with some of its characteristics and reviewing some of its common use cases and tools used to access it. We will then review some concerns and possible security issues. Finally, we’ll discuss some best practices to serve as guidance for working with EC2.

 

Use Cases & Related Tools

This Amazon cloud platform has many uses, with a range of purchase options, APIs, instance types, and networking features. It offers flexibility in the choice of processor, memory, storage options, accelerated graphics, and performance.

 

Autodesk, which develops software for the engineering, design, and entertainment industries, uses AWS’s high-performance-computing capabilities to scale generative design and run hundreds of simultaneous simulations with complex parameters.

 

Another example of its use is a web-application case: MediaWiki installed on Apache with 140 pages of content, where efficient and timely memory, disk, network, and CPU stats were obtained.

 

For more info, check Amazon’s official description.

 

Security & Performance

Security

Cloud security at AWS is a priority for the company, as the job requires high security standards due to the sensitivity of some of the data the platform might be handling. An AWS customer benefits from a data center and network architecture built to meet the requirements of the most security-sensitive organizations. Amazon EC2 works in conjunction with Amazon VPC to provide security and robust networking functionality for the user’s compute resources.

Reliability

Amazon EC2 also offers a highly reliable environment where replacement instances can be rapidly and predictably commissioned. The service runs within Amazon’s proven network infrastructure and data centers, and the company has a history of strict and secure systems. However, liability should never be assumed to rest with the other party, and best practices should always be followed to avoid obstacles or unexpected errors.

Performance

It provides the user with complete control of the computing resources and allows them to run on Amazon’s computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing the user to quickly scale capacity, both up and down, as the computing requirements change. Amazon EC2 changes the economics of computing by allowing the user to pay only for capacity that they actually use, instead of leaving partially filled spaces of storage on the cloud. It also provides developers the tools to build failure resilient applications and isolate them from common failure scenarios.

 

Best Practices

Amazon EC2 (Elastic Compute Cloud) provides scalable computing capacity in the cloud. When using this service, it is highly recommended to monitor it for intrusion attempts or other unauthorized actions performed against your cloud infrastructure.

 

Trying to leverage traditional, non-cloud native solutions in order to backup AWS resources may be costly and ineffective. Traditional backup software and methods are very centralized by nature, holding disadvantages such as creating single points of failure as well as the high cost of software licenses and required dedicated hardware resources.

 

You can also follow this brief checklist:

Security and Network

Manage and monitor access to AWS resources, APIs, and storage platforms using identity federation, IAM users, and IAM roles.

Establish an automated system for managing the distribution of credentials, and implement the least-permissive rules for security and privacy.

Regularly patch, update, and secure the operating system and applications on the instance used.

Storage

Understand the implications of the root device type for data persistence, backup, and recovery.

Use the instance store available for your instance to store temporary data.

 

For more info, check the official guidelines.

Closing Thoughts

In conclusion, AWS EC2 is simple and practical given the scale of the varied, multi-layered options it paves the way for. Traditional forms of hosting seem outdated in the presence of this powerful cloud. This tool represents a solution for many companies that need availability, speed, security, and flexibility in the platform they use.

 

References

Amazon EC2

https://aws.amazon.com/ec2/

Choosing the Right EC2 instance and Applicable use case

How to Automate your EC2 instance backup step by step


AWS Customer Success

https://aws.amazon.com/solutions/case-studies/

EC2 Use Cases

https://documentation.wazuh.com/3.x/amazon/use-cases/ec2.html

Marketing Tools for Developers


Overview

Sales and marketing have changed more in form and structure over the last 100 years than in perhaps any period in their history. Companies now use advanced technologies, making the process increasingly automated while focusing more on the customers themselves and personalizing the purchasing experience as much as possible.

In this article, we will discuss various marketing tools and technologies for developers. Moreover, we will see why developers need those tools.

Finally, we will go over some best practices to get familiar with marketing tools while avoiding the need to go through the internet’s endlessness. To get started, here is a wiki by Clearbit which maps out various types of marketing tools and includes definitions and resources about each of them.

What are Marketing Tools and Technologies?

There are countless marketing tools on the internet, available for various purposes and different users.

I will present some of those tools, grouped into categories based on each tool’s purpose.

Idea Generators: Some of the good tools are Xmind, Scapple, Mindmup and Google Trends.

Ad Networks: iAd for Apple, Chartboost for mobile games, and AdColony for video ads are some of the best-known ad tools.

Market Research: Google Trends, Google Keywords Tools and Priori Data for Mobile Apps.

In-App Analytics: Google Analytics, Apple App Analytics, and Flurry Analytics (a free option).

Push Notifications: Push.io, Xtify, and Appscend are some of the most used.

And much more, for different purposes and/or more customization.

Do Developers Need Marketing Tools? Why?

Nowadays, the internet is full of information about marketing strategies, methods, and tools that can help your company, and most companies have a good toolkit for dealing with it. However, even the best marketing experts in the world need advice or a warning every now and then, and the competition for consumer attention keeps increasing. In fact, there are a few main areas where modern tools can help.

Goals
Most companies with weak marketing strategies fail to define a clear goal for those strategies. This can stem from many things, such as not devoting enough resources to the plan, or not understanding the true demand of the customers and what they want from the manufacturer. Moreover, failing to track and monitor the progress of a campaign, for example, means the company has less control over its market and product.

Improving Strategies

Marketing tools can help companies not just with the overall big picture, but also by diving deep into its specific parts. Many businesses have specific priorities, conversion optimization for example, that can be tackled with tools that increase the conversion rate with tangible results, by analyzing the existing structure of a website, say, and suggesting improvements accordingly.

 

Ease of Work

Finally, a company can decide to adopt marketing tools fully and apply complete marketing automation. This includes analytics, tracking, web content personalization, campaign management, and more.

Generally, companies that use marketing automation to generate leads have higher conversion rates than those that don’t, because these tools offer personalized support that assists the company in reaching its milestones.

How to get acquainted with Marketing Technologies without getting lost in all of the Information online

There are different answers to this question, depending especially on who is asking. But generally, people can acquire a lot of knowledge and skill by attending courses that touch on the various fields within this sphere.

There are several courses you can attend; a good one will offer detailed content, useful tips, and professional guidance, and that is the kind you should sign up for.

There are many online courses for marketing for developers, this is a good example of one of them.

On the other hand, scanning the internet for useful sources isn’t futile either. There are many resources that can further one’s knowledge of the topic, provide guidance on which tools and/or methods to use, and much more.

Two examples of useful resources: The Developer Marketing Guide and this list by apptamin.

Closing Thoughts

We have seen how important marketing tools are for developers and for any successful company, and the reasons behind the need for the new tools available in today’s market, since analyzing and responding to customer reviews and opinions is a fundamental process in sales. I discussed the meaning of the term, its relevance and importance to people, especially marketing managers, and presented several common platforms for different use cases. Moreover, we saw how to learn more about marketing tools by following best practices and using the right resources.

Whether a company is trying to enhance its performance, reach more customers, and deepen its knowledge of its long lists of past customers, or is a startup launching its very first campaign, both need marketing tools to succeed, because of the importance of reaching new customers and personalizing their experiences.

References

The Developer Marketing Guide

https://www.devmarketingguide.com/

The Top 60+ App Marketing Tools for Developers

https://www.linkedin.com/pulse/top-60-app-marketing-tools-developers-jemmy-patel/

App Marketing Tools for Developers

App Marketing Tools For Developers

Why Companies Need Marketing Tools in 2017

 

What is Docker Swarm?


Overview

In this article, we’ll review Docker Swarm: its definition and functions in today’s world, and why it has become a popular and significant Docker feature. We’ll also review a few user types and general use cases, and look at some security issues and best practices for using Docker Swarm.

 

Definition

Docker swarm mode allows you to manage a cluster of Docker Engines natively within the Docker platform. The Docker CLI can be used to create a swarm, deploy application services to it, and manage its behavior.

 

Terminology

Swarm: a swarm is made up of multiple Docker hosts that run in swarm mode and act as managers and/or workers.

SwarmKit: a project that implements Docker’s orchestration layer and is used within Docker to run swarm mode.

Node: a node is an individual Docker Engine participating in the swarm; multiple nodes can run on a single machine or on separate machines.

Task: a Task carries a Docker container and the commands to run inside the container.

 

The Function

Swarm can help IT teams and programmers maintain the lifecycle of individual containers and perform security checks on the system, monitor and change the number of containers according to load, coordinate containers and allocate tasks to groups of them, provide redundancy and failover if nodes fail, and execute periodic software checks across the containers.

 

Why is it important?

From its definition and its various functions, we can easily see that Docker Swarm is quite important for any IT department dealing with containers and maintaining their security. One essential benefit is portability: containers can run on top of virtual machines or servers, and can be stored on-premises or in the cloud. Coders can write a program, put it in a container, and move it between environments while keeping the content intact, encapsulated within the container.

 

Who uses it?

Docker Swarm is used by various users and for different purposes. It can be used in production by IT teams, programmers, and software engineers. Its use doesn’t really depend on the scale of the work, as the platform is quite scalable. Docker is used by many tech companies, and many service providers support it on their platforms.

 

Use Cases

For Decentralized systems

Node roles are not differentiated at deployment time; instead, the Docker Engine handles any differentiation at runtime.

Cluster Management

The CLI can be used to create a swarm of Docker Engines where the user can deploy application services, with cluster management integrated into the Docker Engine.

Scaling

The user can choose the number of tasks they want to run for each service, with the ability to change that number in real time.

And of course many more uses.

 

For more details, check this website.

 

Security Issues

The system provides adequate security, yet there are concerns about the security of the containers themselves. Major concerns are the intrinsic security of the kernel and how its security measures interact with containers.

 

Some argue that if best practices are followed, Docker containers can be as safe as virtual machines, and that most incidents that have occurred were caused by human error rather than by a flaw or hole in the system itself.

 

Docker Trusted Registry (DTR) is the enterprise-grade image storage solution from Docker. It is installed behind a firewall so that Docker images can be securely stored and managed.

 

For more info, check this page.

 

Best Practices

Storing secrets in the container image exposes them and makes them vulnerable to misuse. Instead, provide the container with access to the secrets it needs while it is running, and not before.

 

Use trusted images: set up a trusted registry of base images, which are the only images developers are allowed to use. Use both educational and enforcement controls to prevent the use of untrusted images, which might be malicious and damage the environment.

 

Secure your runtime environment by applying namespace and cgroups permissions to isolate access and control the sphere of influence of each process. Containers can connect to each other inside the same host and across clusters, making their communication invisible to traditional firewalls and networking tools, and limiting the ability to understand and control traffic at a granular level. Therefore, use container-level nano-segmentation to limit the potential ‘blast radius’ in case a container tries to do something it shouldn’t.

 

Perform vulnerability scanning to keep images with known vulnerabilities from running in the production environment, using tools that perform periodic security checks. In an active mode, establishing and managing security policies across the whole container lifecycle can make the containerized environment very secure.

 

Closing Thoughts

We have seen a brief explanation and definition of Docker Swarm. We also went through its functions and how it can help IT teams and programmers, talked about why it is important and who uses it, and viewed some of its use cases. Finally, we considered some security concerns and presented several best practices.

All in all, Docker Swarm is a pretty useful tool when used carefully.

 

References

Docker Swarm 101

https://www.aquasec.com/wiki/display/containers/Docker+Swarm+101

Docker Security Best Practices

https://blog.aquasec.com/docker-security-best-practices

Swarm Mode Key Concepts

https://docs.docker.com/engine/swarm/key-concepts

Docker 101

https://www.networkworld.com/article/2361465/cloud-computing/docker-101-what-it-is-and-why-it-s-important.html