#Enterprise #Architect with/and #Agile

An enterprise architecture is a disciplined approach to envisioning how to take a business from its current state to the next. It boils down to making sure an organisation’s systems work together harmoniously to meet business objectives across different views. It is a process of developing capabilities and defining the principles and frameworks that guide implementation.
It is often observed that architecture slows development teams down, because architects carry the bigger picture and a long-term vision. Engineering teams, on the other hand, have evolved much more quickly than architecture practices, because challenges and lessons learned from past deliveries have forced them to act on changing business needs. I’ll skip other software development life cycles for the sake of keeping this short. In recent years, Agile has improved the way software is delivered – by keeping things simple, delivering in chunks rather than big-bang, delivering what is required now rather than what was required then, mitigating risks early, assessing early, and dealing with changing requirements. This has helped development keep pace with the business without sacrificing quality.
It is necessary to understand that the business expectations at the point where architecture work starts may have changed by the time it finishes. Architecture has to take a similarly incremental approach to delivery: today’s architecture cannot deliver a business vision based on discussions held a few years back. EA should streamline how it delivers guidance to engineering teams, address ever-changing needs, and re-evaluate decisions as requirements change. Although we are talking about the vision and its delivery, the key question is how EA can improve its deliverables and keep pace with business needs without compromising its goals, principles, and guidelines. It’s time to transition naturally and engage more closely with projects; this will bring more agility into the process.
The Agile methodology of delivering projects can help EA reassess its key goals and drivers from time to time. The tricky part is integrating the two so that EA neither over-engineers nor loses sight of what the business is looking for as requirements change. It is also important to understand how an Agile team will cope with such deliverables when it is not a monolithic project delivery. It’s time to look beyond the horizon, accept the key challenge of constant change, and define a strategy to mitigate that risk: reassess designs from time to time, follow best practices, introduce iterations within the architecture delivery phases, and introduce or co-work with Agile Model-Driven Development, focusing on the model, the vision, and feedback.

From #developer to #architect – Step 1

So, you’ve decided to be an architect? What made you choose this stream – love of technology, a passion for building solutions, or perhaps you dislike your architect and believe you can do better? Anyway, since you have decided (or are reading for curiosity’s sake), let me lay out some basic information you should know when you aspire to be an architect. I’ll keep this as simple as it should be; consider this article a baby step, a first step.

Rule # 1: Inputs and deliverables – You may have observed that architects often start their answer with “it depends”! Yes, that’s correct, and it usually does! As a developer, you typically know a single approach to a problem – to open a file dialog and save a file from your application, you’ll use a one-line call in your favourite programming language, such as FileDialog.Open(). An architect has to consider various factors and inputs before coming up with an answer, hence “it depends!”. Deliverables are what you, as an architect, will deliver. Unlike a coder, who delivers code (a module, a database, a set of classes, etc.), an architect’s deliverables are often documents and artifacts, and they will vary depending on the audience.

Rule # 2: Fundamentals – Strengthen your fundamentals. By now, you’ve invested time and effort in development: worked round the clock, burned your weekends, gained calories, compromised your health (and hair). Being an engineer, you should know how things work! By things, I mean anything you are familiar with – languages, databases, programming, etc. This will help you justify what to use, and when.

Rule # 3: Keep learning – by that I mean, be a learner. Don’t limit yourself to a specific technology or concept. This will widen your thinking and help you choose when to use what!

Rule # 4: Be a good listener – Developers are impatient; they often jump into coding without thinking through a solution’s cons. I suggest listening first and reacting afterwards. Start practicing drawing on paper: it helps you visualise what is being said.

Rule # 5: Terminology – Architects often speak the language the business is familiar with. Start thinking from the business’s point of view – how has your earlier work helped the business? – and try to visualise it in the bigger picture.

Rule # 6: Visualize – Start visualizing the end product/software/business solution well before it is realized. Be analytical about how a particular business requirement will be solved through software.

Rule # 7: Communication – Most importantly, be communicative. Unlike in the development zone, this soft skill will help you grow in all dimensions. As an architect you will deal with different stakeholders – project managers, business, analysts, developers, operations, etc. – so you should communicate precisely and to the point.

Rule # 8: Some managerial skills – Get work done through your peers, understand the strengths and weaknesses of your team, and based on that, react and estimate the time required to develop your architecture. Be a good planner and a good estimator.

Rule # 9: Patterns and practices – Know the patterns that are solutions to common problems. This will not only reduce effort but also improve the quality of the architecture you deliver.

Rule # 10: Books – Books are a man’s best friend. Keep reading; understand concepts, patterns, solutions, common practices, guides, and common terminology. This will not only help you grow but also widen your boundaries of thinking.

Mind map for TOGAF v9.1

A few months back, I appeared for the TOGAF v9 exam and cleared it successfully. Although the exam was tough in itself, my experience with earlier projects helped me understand its concepts. For those who haven’t been so fortunate, I suggest looking for a mentor and taking their help – it really does help. Many of us align TOGAF with the Solution Architect role, or even with technical experience; to some extent that is true, but as a whole these are just pieces of a bigger puzzle.
In short, TOGAF is about applying its Architecture Development Method (ADM) to a given state of requirements, leading it to the next stage while building the enterprise’s capability and maturity. I believe this can be applied to any business requirement or problem that involves progressing from one (current) state to another (more advanced, more mature) state.
During my exam study, I took help from different sources, including the self-study kit from The Open Group, and I created some notes of my own along the way. The mind map is the note I keep referring to whenever I need a quick glance or a reminder.
It was created using FreeMind, and I would like to share it for aspiration, inspiration, and reference study.
Hope it helps you.

Download from here: http://1drv.ms/1WvYO4N

Cloud for Research

Cloud has attracted a lot of attention – be it organisations migrating their legacy solutions to the cloud, new development serving different devices and platforms, or research into new ideas. It is more than a buzzword. Cloud is now more than its basic avatar of computing, networking, and storage: business demands have grown by leaps and bounds, and to keep pace, cloud platforms are evolving and stretching their arms to embrace more than the basic necessities.

Despite the cloud’s popularity, it has been unclear whether it can serve the research field. Research is always expensive: it requires quality expertise, deep experience, and the patience to examine different factors under a controlled environment. Objective-based research is heavily funded. Large historical datasets are often parsed with heavy processing, requiring enough computing power for researchers to manipulate the data and explore it in combinations. The expensive, continuously running infrastructure needed to wrangle and dismantle data is often replaced with compromised or cheaper resources to save cost.

Cloud computing holds the promise of availability and scaling. Vendors like Microsoft Azure, by design, should help researchers analyse data and delegate mundane tasks to the cloud. Intensive algorithms – industry-wide, proprietary, or scientifically proven – can be executed on map-reduce frameworks, and high-quality predictions achieved using machine learning.
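To make the map-reduce idea concrete, here is a toy sketch in Python of the pattern those frameworks implement: independent "map" steps produce partial results per data partition, and a "reduce" step merges them. The partitions and words below are made up purely for illustration.

```python
from collections import Counter
from functools import reduce

# Toy dataset split into "partitions", as a cluster would shard it
partitions = [
    "cloud computing scales research",
    "research needs computing power",
]

# Map: each partition independently emits partial word counts
mapped = [Counter(text.split()) for text in partitions]

# Reduce: merge the partial counts into one final result
total = reduce(lambda a, b: a + b, mapped)
print(total["research"])  # 2
```

In a real framework the map steps would run in parallel on many machines; the programming model, however, is exactly this shape.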

Heavy lifting and shifting of data and integration with different data sources are tasks that are easily plugged into the cloud using either out-of-the-box or custom adapters. Researchers should now start thinking of the cloud as a platform and exploit its benefits by buying lease-based resources. This will help them focus on their research rather than worrying about high-end computing systems and servers.

Microsoft and other cloud providers continue to invest heavily in security and mature with every cloud upgrade. Adopting mandatory industry-level compliance will not only show that they are concerned about protecting data and algorithms, but will also build trust.

#Arduino #IoT Project 5 | Detect motion using PIR sensor and blink LED

Project – Detect motion using PIR sensor and blink LED

Output – http://youtu.be/Qkiwpjyv0BU, http://youtu.be/CtG75k7dzt8


int ledPin = 13;     // LED on digital pin 13
int inputPin = 2;    // PIR sensor on digital pin 2
int pirState = LOW;  // we start assuming no motion detected
int val = 0;         // variable for reading the pin status

void setup() {
  pinMode(ledPin, OUTPUT);   // declare LED as output
  pinMode(inputPin, INPUT);  // declare PIR sensor as input
  Serial.begin(9600);        // open serial port for status messages
}

void loop() {
  val = digitalRead(inputPin);   // read input value
  if (val == HIGH) {             // check if the input is HIGH
    digitalWrite(ledPin, HIGH);  // turn LED ON
    if (pirState == LOW) {
      // We only want to print on the output change, not state
      Serial.println("Motion detected!");
      pirState = HIGH;
    }
  } else {
    digitalWrite(ledPin, LOW);   // turn LED OFF
    if (pirState == HIGH) {
      Serial.println("Motion ended!");
      pirState = LOW;
    }
  }
}

Data Scientist | Relational Algebraic significance in Data Science

We have been using relational databases for decades now, and the normalisation of a database is key. The relational model rests on three base components: structures, constraints, and operations. Although this is only one format of source data a data scientist will encounter, it is also one a data scientist will need for storing values – for example, to store processed data for visualisation or for further manipulation.

As a data scientist, it is necessary to understand that programs manipulating tabular data exhibit an algebraic structure that allows reasoning and manipulation. There is an algebra of tables whose operations – select, update, insert, and delete – act on tables, along with projection through columns and joins with other tables. This may include aggregation, union, difference, cross product, and many others.
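The core relational-algebra operations map directly onto SQL. Here is a minimal sketch using Python’s built-in sqlite3 module; the table names and rows are invented purely for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE emp (id INTEGER, name TEXT, dept_id INTEGER)")
cur.execute("CREATE TABLE dept (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                [(1, "Asha", 10), (2, "Ravi", 20), (3, "Mei", 10)])
cur.executemany("INSERT INTO dept VALUES (?, ?)",
                [(10, "Research"), (20, "Sales")])

# Selection (sigma): filter rows by a predicate
rows = cur.execute("SELECT * FROM emp WHERE dept_id = 10").fetchall()

# Projection (pi): keep only the chosen columns
names = cur.execute("SELECT name FROM emp").fetchall()

# Join: combine two tables on a common attribute
joined = cur.execute(
    "SELECT emp.name, dept.name FROM emp "
    "JOIN dept ON emp.dept_id = dept.id"
).fetchall()

print(rows)
print(joined)
```

Every one of these operations takes tables in and produces a table out, which is exactly the algebraic closure property the text describes.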

Hence, it is necessary to understand relational algebra alongside relational databases: it helps a data scientist optimize queries and reason soundly about the data while performing various operations. As an example, applying the algebraic laws of arithmetic – division, multiplication, addition, and subtraction – can reduce the overhead of executing a query and deliver a quality response time. Data scientists often neglect these simple laws, which can impact the result sets. Take the expression p = ((z*5) + (z*8) + a) / b, where the variables a, z, and b are replaced with values during evaluation. If a is always 0 and b is always 1, extra algebraic operations are being evaluated for nothing. Remember, we are talking about datasets measured in gigabytes, not a few megabytes.

As a data scientist, you should think about the numbers behind the tables and columns. In the earlier expression, adding 0 to any number or dividing any number by 1 makes no difference to the output. Hence, it pays to break the expression down, validate it, and rewrite it so that it performs fewer evaluations and cycles. This is symbolic reasoning, and it is important to understand that computers follow the arithmetic instructions of a given expression literally: they will not perform this kind of symbolic reasoning for you. When the objects you manipulate are not mere integers but terabyte-sized tables, skipping that reasoning becomes very expensive.
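To make the simplification concrete, here is a small Python sketch (the function names are mine, introduced for illustration) showing that the full expression and its algebraically reduced form agree whenever a = 0 and b = 1:

```python
def p_full(z, a, b):
    # The original expression, evaluated term by term
    return ((z * 5) + (z * 8) + a) / b

def p_reduced(z):
    # Algebraically simplified: (5z + 8z + 0) / 1 == 13z
    return 13 * z

# The two forms agree for every z once a = 0 and b = 1
for z in range(-5, 6):
    assert p_full(z, a=0, b=1) == p_reduced(z)
print("expressions agree")
```

The reduced form does one multiplication instead of two multiplications, two additions, and a division; at a few rows the difference is invisible, at billions of rows it is not.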

So, it is a question of cost-based optimization, since the shortest version of an expression does not always perform best. Relational database engines parse a SQL query into relational algebra before execution and associate an execution cost with it, depending on the data volume and the predicates in the WHERE/GROUP BY/HAVING clauses. It is common practice to evaluate candidate expressions on sampled data and choose the one with the lowest cost. Note also that performing a query on a table (or tables) always returns a table – call it a result set, dataset, or data view.
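You can watch this SQL-to-algebra translation happen. SQLite, for instance, exposes the plan it chose through EXPLAIN QUERY PLAN; the sketch below (table and data invented for illustration) also shows the closure property – the query result is itself a table of rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("east", 100.0), ("west", 50.0), ("east", 25.0)])

query = ("SELECT region, SUM(amount) FROM sales "
         "WHERE amount > 10 GROUP BY region")

# The engine translates SQL into a relational-algebra plan first;
# EXPLAIN QUERY PLAN exposes the plan SQLite actually chose.
plan = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan)

# Closure: a query over tables returns a table (here, a list of rows)
result = cur.execute(query).fetchall()
print(result)
```

The exact plan text varies between SQLite versions, so treat it as diagnostic output; the result set, by contrast, is determined by the algebra.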

Hence, you must also consider how join expressions associate (left, right, or cross joins). Furthermore, these are often combined with logical operators like AND and OR, and negated with NOT.
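The logical operators obey algebraic laws of their own, which an optimizer (or a careful data scientist) can exploit when rewriting predicates. A quick check of De Morgan’s law on a toy truth table in SQLite (table and values invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (x INTEGER, y INTEGER)")
cur.executemany("INSERT INTO t VALUES (?, ?)",
                [(0, 0), (0, 1), (1, 0), (1, 1)])

# De Morgan: NOT (x AND y) is equivalent to (NOT x) OR (NOT y)
lhs = cur.execute(
    "SELECT x, y FROM t WHERE NOT (x = 1 AND y = 1)").fetchall()
rhs = cur.execute(
    "SELECT x, y FROM t WHERE x <> 1 OR y <> 1").fetchall()

assert lhs == rhs  # the two predicates select the same rows
print(lhs)
```

Rewrites like this let the engine push a cheap filter below an expensive join without changing the result.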

Distinguish | #datascientist vs #visualisation vs #businessintelligence #bi vs #machinelearning #ml vs #statistics vs #dba

How each role differs from a data scientist, and the skills gap to close:

Business Intelligence – Business Intelligence is a particular approach to a particular problem. A BI engineer is not expected to consume their own data products, perform their own analysis, and make business decisions themselves; usually they build tools for others to make decisions with. Data scientists do both. Gap to close: learn statistical modelling and how to communicate results with business groups and decision makers.

Statistics – Statistics is at the heart of what a data scientist does day-to-day. Statisticians are comfortable assuming all the data they encounter is available in a single place, and they extract the most information possible from a sparse, expensive-to-collect, small data set. Data science is a newer engineering discipline that works on massive datasets and analyses disparate data rather than a sample or an assumption, though the models and methods remain the same. Gap to close: learn to deal with data that does not fit in, or come from, a single source.

DBA – Database programmers bring a lot of skills and expertise. However, with diverse data arriving from different sources – graph nodes, vectors – there is less chance of the incoming data being relational, so the relational toolset may or may not be the right one, even if its concepts transcend any particular system. Data scientists work on disparate, diversified datasets, including non-relational data and formats other than text – audio, video, binary – helping them perform deep analysis and bring insights. Gap to close: learn to deal with unstructured data.

Visualization – Like statisticians, visualization specialists are concerned with limited data from a single source rather than multiple sources. Data scientists conceptualize how the final data should be made interactive so that business decision makers can decide based on the available historical data. Gap to close: learn about algorithms and trade-offs at scale.

Machine Learning – Machine learners bring reality to the concept of performing predictions on data. However, it is largely an engineering process in which choosing a technique, applying it, and running it is a fairly small fraction of the work. Data scientists prepare the models, methods, and techniques that determine how machine learning will help their data application; a lot of work happens before the machine-learning stage – data manipulation, wrangling, cleaning, massaging, data jujitsu, data munging. Gap to close: learn to do statistical analysis and data wrangling from multiple sources before the machine-learning step.
