Saturday 26 July 2014

How to make extra cash - new sources of secondary income?



Online surveys: There are many survey sites such as https://www.airmilesopinions.ca/, http://www.cashbackresearch.com/, and http://www.univoxcommunity.com/ that offer reward points or actual dollars for taking surveys. The surveys are typically about your shopping habits, car purchases, health, the food you consume, etc.
The problem with most of these surveys is that it is hard to qualify for every one, and some of them are pretty long. I tried these for a couple of months and finally gave up; it was not worth my time. These companies sell the data to the sponsors of those surveys, or analyze it themselves, mostly for marketing purposes. Both the survey participant and the survey provider make money, but to me it seems to help the provider more than the taker. There might be some people who have successfully made money out of it, so maybe try it and see for yourself.


Rewards and loyalty programs: There are lots of loyalty programs, from Air Miles and Aeroplan to Petro-Points, TD Rewards, etc., that offer reward points to registered customers. Over the long term it is possible to collect a meaningful number of points if you register with more than one program, or use a single program to the maximum. For instance, Air Miles has tie-ups with many sponsors such as Esso and Children's Place, so a customer using the card at those locations can accumulate points, though it is a very slow process. Now, what does this have to do with data? Are these companies just making you loyal to certain brands? That is partially true. These loyalty companies are basically marketing companies: they analyze customer behavior, profile customers, analyze the products, and sell the results to the sponsor companies.

Mobile reward apps: Have you heard of the Checkout 51 and SnapSaves apps? If not, please check them out. If you shop a lot at grocery stores, these apps are meant for you. When you enter a store, just check all the products on offer in those apps. Buy any of the offered products and you get 50 cents, a dollar, or sometimes a $2-$5 discount on the product. Take a photo of your receipt and send it using the app; if it is approved, you will be credited with the money promised. It is quick and you can collect some real money with it. This is data at your fingertips that you can use to make some cash, isn't it?

Ghost/mystery shopping: a tool used externally by market research companies, or internally by companies themselves, to measure quality of service, compliance with regulations, or to gather specific information about products and services. I had tried http://www.lanla.com a while ago; this company pays you to evaluate the quality of service at, for example, a restaurant or a store. The shopper's identity and purpose are generally not known to the establishment being evaluated. Mystery shoppers perform specific tasks such as purchasing a product, asking questions, registering complaints, or behaving in a certain way, and then provide detailed reports or feedback about their experience. So the shopper provides data to the agency performing the evaluation, and both make money from it. Sounds like a good idea for shopping freaks?

Build your own website: If you are reasonably good with technology, it is easy these days to build a website. It could be a blog that you create on www.blogger.com or www.wordpress.com (you can get started in minutes), a forum, or a site where you sell stuff like eBay or Kijiji. There are numerous templates available to build the website of your choice. Once you have your website, you can monetize it by selling ads on it. Companies like Google AdSense or Clicksor allow you to place their ads on your website. Setup is easy, and if you can build enough traffic to your website, you can make some extra cash from this endeavor.

Sell online: www.kijiji.com, any Craigslist site, or the famous www.ebay.com allow you to sell your goods online, whether used items or new ones. eBay lets you set up an online store where people from all over the world can buy your items; Kijiji and Craigslist are good for selling your used (and also new) items locally.

Uber: Have you heard of the Uber ride-sharing program? If it operates in your city, you can use it to provide rides to people in your city or neighborhood for a reasonable fee. I have read news stories about people using Uber and making good money. If someone needs a ride to the airport, why not give them a ride, meet some people, and also make some extra cash?

Monday 21 July 2014

Troubleshooting: Common Informatica Error Messages and Their Resolutions



1)
ERROR  7/21/2014 5:16:25 PM    READER_1_1_1 HIER_28056        XML Reader: Error [ExpectedCommentOrPI] occurred while parsing:[FATAL: Error at (file /data/pmrootfolder/test.xml, line 1, char 28236 ): Expected comment or processing instruction.]; line number [1]; column number [28236]
Database Error: Failed to connect to database using user [TESTDB] and connection string [TOREST].].
ERROR  7/16/2014 10:40:18 AM  READER_1_1_1 BLKR_16001        Error connecting to database...


RESOLUTION: Check connectivity to the database. If you are not able to connect, please contact the DBA to resolve the database issue.

For XML writer issues check also:



2)
Error message for the failure:
ERROR : (8806 | TRANSF_1_1_1) : (IS | ETL_IS) : : pmsql_50065 : [ERROR] ODL error:
 FnName: Execute -- Communication link failure.


RESOLUTION: Check connectivity to integration server or database.





Error connecting to database [
[IBM][CLI Driver] SQL30081N  A communication error has been detected.  Communication protocol being used: "TCP/IP".  Communication API being used: "SOCKETS".  Location where the error was detected: "172.168.10.1".  Communication function detecting the error: "connect".  Protocol specific error code(s): "146", "*", "*".  SQLSTATE=08001
 


RESOLUTION: Check connectivity to the database.
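When you see a SQL30081N or similar TCP/IP communication error, it can help to rule out basic network reachability before involving the DBA. Below is a minimal sketch of such a check using Python's standard library; the host and port in the usage comment are placeholders, not values taken from the error above.

```python
import socket

def can_connect(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (host/port are placeholders - take them from the error message):
#   can_connect("172.168.10.1", 50000)
```

If this returns False, the problem is at the network or listener level (firewall, wrong port, database down) rather than in the Informatica connection definition.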



3)
FATAL   9/9/2013 9:31:28 AM      *********** FATAL ERROR : An unexpected condition occurred in file [/pmroot/pc9x_root/910HF/build/powrmart/server/cmnutils/soutstream.cpp] line [214].  Aborting the DTM process.  Contact Informatica Global Customer Support. *********** 

RESOLUTION: This is a bug in Informatica version 9. Recover the session and it should work properly.



4)
ERROR    9/13/2013 6:20:08 AM           DIRECTOR               REP_12400              Repository Error (
ORA-30032: the suspended (resumable) statement has timed out
ORA-01536: space quota exceeded for tablespace 'TEST_DATA'

RESOLUTION: Problem with the Oracle tablespace. Fix the tablespace issue, e.g. ask the DBA to increase the space quota on the TEST_DATA tablespace.



5)
ERROR  4/5/2013 4:21:40 AM      WRITER_1_*_1 Net_1762            [ERROR] Connection timeout expired
ERROR  4/5/2013 4:21:40 AM      WRITER_1_*_1 Net_1762            [ERROR] Integration Service could not connect to Teradata Performance Server.

RESOLUTION: Check connectivity to the database. If you are not able to connect, please contact the DBA to resolve the database issue.



6)
ERROR : (26061 | POST-SESS) : (IS | ETL_IS) : : CMN_1949 : Error: [Pre/Post Session Command] Process id 26217. The shell command failed with exit code 1.


RESOLUTION: Check the post-session command and resolve the issue with it.


7)
XML Reader: Error [XMLException_Fatal] occurred while parsing:[FATAL: Error at (file EMPTY, line 1, char 664 ): An exception occurred! Type:UnexpectedEOFException, Message:The end of input was not expected.]; line number [1]; column number [664]
HIER_28058    XML Reader Error
TRANSF_1_1_1    MXR_91003    Failed to process an XML document.

Resolution: I was reading an XML file for parsing and got the above error. The file was read as a string, and somehow the string was getting truncated. The reason turned out to be that the column delimiter in the session's file properties for that input file was a comma (,), and commas appeared in the middle of the file I was reading, so the reader was not picking up the complete content. I fixed the issue by changing the file delimiter to a character that does not appear in the data, such as \050, after which the complete file was read and parsed.
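The truncation effect is easy to reproduce outside Informatica: splitting a record on a delimiter that also occurs inside the XML payload cuts the payload at the first comma. A small sketch of the idea (the XML snippet is made up):

```python
# A single "record" whose XML payload happens to contain a comma.
record = '<doc><name>Smith, John</name><city>Toronto</city></doc>'

# With comma as the column delimiter, the field is cut at the first comma:
truncated = record.split(",")[0]
# truncated == '<doc><name>Smith'

# With a delimiter that never occurs in the data ("\050" is the octal escape
# for "(", the character used in the resolution above), the payload survives:
intact = record.split("\050")[0]
# intact is the full record, since "(" does not appear in it
```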




8)
ERROR    9/8/2014 1:48:14 PM    my_informatica_server.com    MAPPING    CMN_1022    Database driver error...
CMN_1022 [select * from test_table
FnName: Execute Direct -- ERROR:  test_table does not exist]


Resolution: This is the result of a SQL error. In the above example, test_table does not exist; create the table and try again. In general, resolve the underlying SQL error.



9)
Error:
FATAL     *** FATAL ERROR : Failed to allocate memory. Out of virtual memory. *********** 
FATAL     *** FATAL ERROR : Aborting the DTM process due to memory allocation failure. *********** 

Resolution: There is not enough memory to complete the task; either the swap space or the disk space is full on the Informatica server. Clear up some space and recover the failed session. Another thing you can do is reduce the cache size and the automatic memory settings and check whether that helps.
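Before retrying, you can quickly check how much free space the server has in the directories the DTM writes to. A sketch using the Python standard library; the path is a placeholder for your actual $PMRootDir, cache, or temp locations:

```python
import shutil

def free_gb(path):
    """Free disk space at `path`, in gigabytes."""
    return shutil.disk_usage(path).free / (1024 ** 3)

# Check the directories the DTM writes cache and temp files to
# (replace with your actual $PMRootDir / cache / temp locations):
for d in ["/tmp"]:
    print(f"{d}: {free_gb(d):.1f} GB free")
```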


10) ERROR READER_1_1_1 RR_4035 SQL Error [
FnName: Bind Col -- Invalid descriptor index
Database driver error...Delayed binding at fetch time failed.].

Resolution: I got this issue when the SQL override in my session was different from the SQL in the mapping: the SQL override was not mapped properly to the ports in the mapping. This is a common issue in mappings with SQL overrides. The fields in the Source Qualifier's SQL must always line up with the SQ's connected outgoing ports, not with all of the ports defined on the SQ.


11)
[ERROR] Web Service invoker encountered an error while invoking the Web Service. Reason:
 List does not exist.
 The page you selected contains a list that does not exist.  It may have been deleted by another user.

Resolution: I got this error when connecting to a SharePoint web service. Most likely you are not connecting properly to the list, or the list does not exist on SharePoint, in which case the SharePoint admin has to fix the issue. In our case, Informatica was not able to connect to the host server of the SharePoint site, and we resolved it by moving to a SharePoint site that Informatica could connect to properly.


12)
ERROR    11/25/2014 9:36:55 PM    node91_host    TRANSF_1_1_1    JAVA PLUGIN_1762    [ERROR] java.lang.NullPointerException
ERROR    11/25/2014 9:36:55 PM    node91_host    TRANSF_1_1_1    JAVA PLUGIN_1762    [ERROR]     at com.informatica.powercenter.server.jtx.JTXPartitionDriverImplGen.execute(JTXPartitionDriverImplGen.java:419)
ERROR    11/25/2014 9:36:55 PM    node91_host    TRANSF_1_1_1    TM_6085    A fatal error occurred at  transformation [JAV_split_sku], and the session is terminating.

RESOLUTION:
The Java transformation is trying to assign a null value to some variable, or to perform an operation on a null variable. Filter out the records that have null values in the fields used by the Java transformation.
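The pre-filter amounts to dropping any row that would feed a null into the Java code; in PowerCenter this would typically be a Filter transformation placed before the Java transformation. A sketch of the idea in Python (the field names are made up):

```python
rows = [
    {"sku": "A1", "qty": 2},
    {"sku": None, "qty": 5},    # a null here would blow up the Java code
    {"sku": "B2", "qty": None},
]

# Keep only rows where every field the Java code touches is non-null.
required = ("sku", "qty")
clean = [r for r in rows if all(r[f] is not None for f in required)]
# clean == [{"sku": "A1", "qty": 2}]
```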


13)
INFO : (7638 | READER_1_1_1) :  node01_DW : FR_3055 : Reading input filenames from the indirect file [/root/informatica/SrcFiles/myfilelist_.ind].
ERROR : (7638 | READER_1_1_1) : : node01_DW : FR_3000 : Error opening file [/root/informatica/SrcFiles/myfilelist_.ind].  Operating system error message [No such file or directory].
ERROR : (7638 | READER_1_1_1) :   : BLKR_16002 : ERROR: Initialization failed.

Resolution: The indirect file you are referring to does not exist. Create the file, or check the path of the file.
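A quick pre-session sanity check can catch this before the reader fails: verify that the indirect file exists and that every source file listed inside it exists too. A sketch (the .ind path in the usage comment is a placeholder):

```python
import os

def check_indirect_file(ind_path):
    """Return a list of problems found with an indirect (file-list) file."""
    if not os.path.isfile(ind_path):
        return [f"indirect file missing: {ind_path}"]
    problems = []
    with open(ind_path) as f:
        for line in f:
            src = line.strip()
            # Each non-empty line of an indirect file names one source file.
            if src and not os.path.isfile(src):
                problems.append(f"listed source file missing: {src}")
    return problems

# Usage (path is a placeholder):
#   check_indirect_file("/root/informatica/SrcFiles/myfilelist_.ind")
```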


14) ERROR      node91    TRANSF_1_1_1    JAVA PLUGIN_1762    [ERROR] Failed to bind column with index [0] to data type [BIG_DECIMAL]
ERROR   node91    TRANSF_1_1_1    JAVA PLUGIN_1762    [ERROR]     at com.informatica.powercenter.sdk.server.IBufferInit.bindColumnDataType(IBufferInit.java:96)
ERROR    node91   TRANSF_1_1_1    JAVA PLUGIN_1762    [ERROR]     at com.informatica.powercenter.server.jtx.JTXPartitionDriverImplGen.init(JTXPartitionDriverImplGen.java:74)

Resolution: This is caused by mismatched high-precision settings between the Java transformation and the session. Enable high precision on both the session and the Java transformation. To enable high precision, check out:




15) Error: RR_4035 SQL Error [
FnName: Execute -- ERROR: 191744 : Not enough memory for n-squared join
FnName: Execute -- 523 158].

Resolution: In my case, this was caused because I had two joins, and the second join, which I believe was a master outer join, did not have enough memory.



16) Error: (95_IS 1/22/2015 12:11:12 PM) Abort Workflow: Request acknowledged
(95_IS 1/22/2015 12:11:12 PM) Abort Workflow: ERROR: Cannot stop or abort workflow [id = 1878] or a task within the workflow. The specified run id [3181132] is not found on this Integration Service.

Resolution: The workflow is in failed status, so it cannot be aborted. Restart the workflow. If you want to be able to recover the workflow, let it suspend when it fails.


17) ERROR    1/18/2010 5:32:02 AM    nmynod_95    MAPPING    CMN_1022    Database driver error...CMN_1022 [update employeed set myflag='Y'
FnName: Execute Direct -- ERROR:  Concurrent update or delete of same row [tbl 669019 dsid 40 tx 0x89ec1c U prev 0x89ebc2]

Resolution: Two sessions are trying to update or insert records into the same table. Let one of the sessions complete, and then recover this session.



18) Error loading into target [MY_TARGET_TABLE] : Bad rows exceeded Session Threshold [1]
 WRT_8333 : Rolling back all the targets due to fatal session error.

Resolution: This error occurs when there are more transformation errors in the input data than the session allows. If you set the bad-row session threshold (e.g. Stop on Errors = 5) in the session properties, then up to 5 error rows will be ignored and the 6th will cause the session to fail. To resolve this, either increase the session error threshold or fix the mapping to handle the bad data.
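The threshold behaves like a simple counter. Here is a sketch of that logic in Python (not Informatica code, just an illustration of how a Stop on Errors style threshold works; `transform` is a made-up stand-in for a transformation that rejects bad rows):

```python
class SessionFailed(Exception):
    pass

def transform(row):
    # Stand-in for a transformation that rejects non-numeric input rows.
    return int(row)

def run_rows(rows, stop_on_errors=5):
    """Load rows, ignoring up to `stop_on_errors` bad rows before failing."""
    errors, loaded = 0, []
    for row in rows:
        try:
            loaded.append(transform(row))
        except ValueError:
            errors += 1
            if errors > stop_on_errors:
                raise SessionFailed(
                    f"Bad rows exceeded session threshold [{stop_on_errors}]")
    return loaded
```

With a threshold of 1, a single bad row is skipped and the load continues; a second bad row fails the "session".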



19) Error: [Pre/Post Session Command] Process id 15555. The shell command failed with exit code 1.

Resolution: The pre- or post-session command failed, and the session is configured to fail if the pre/post-session command fails. Find out why the command is failing and fix it. If the pre-session command failed, you most likely have to start from the beginning; if the post-session command failed, you can just run that post command separately.



20)ERROR    2/19/2015 10:48:52 AM    mynode WRITER_1_*_1    WRT_8229    Database errors occurred: ORA-26002: Table OJ_X.MYTABLE_WORK has index defined upon it.
Database driver error...
Function Name : Prepare DPL
SQL Stmt : INSERT INTO OJ_X.MYTABLE_WORK (COLUMN1,)  VALUES ( ?)
Oracle Fatal Error

Resolution: This error was caused by Bulk mode being turned on in the target settings of the session. Change the target load type to Normal and it should be resolved.


 21)*********** FATAL ERROR : An unexpected condition occurred in file [/export/home/builds/pc9x_root/910HF/build/powrmart/common/utils/ublkdesc.cpp] line [338].  Aborting the DTM process. 

Resolution: There are a few resolutions that I found for this. For me, the issue was caused by insufficient memory, and changing the DTM buffer size to Auto seems to have fixed it. Every time I recovered the session, it would work. Other resolutions I found:
https://community.informatica.com/message/53392
https://community.informatica.com/message/124778#124778
https://community.informatica.com/message/124868#124868



22) ERROR  4/30/2015 5:55:34 PM    node01      READER_1_3_1 RR_4035               SQL Error [
[IBM][CLI Driver][DB2] SQL0805N  Package "MYDATABASE.NULLID.SYSSH200.5359534C564C3031" was not found.  SQLSTATE=51002
sqlstate = 51002


Resolution: This is a binding problem with DB2; http://www-01.ibm.com/support/docview.wss?uid=swg21574086 discusses it. Run the /usr/local/bin/db2ls command to see the installation path of DB2, and ask your DBA to resolve the binding issue. If your DB2 PowerCenter module connection still does not work after the issue is resolved, try setting up an ODBC connection in odbc.ini and check whether you can connect using ODBC.


23) ERROR  9/22/2015 2:26:45 PM    node01 WRITER_1_*_1 WRT_8229           Database errors occurred:
ORA-26002: Table MYTESTDB.MTYTABLE has index defined upon it. 
Database driver error...
Function Name : Prepare DPL
SQL Stmt : INSERT INTO MYTABLE(BATCH_ID,BATCH_START_DATETIME)  VALUES ( ?, ?,)
Oracle Fatal Error

Resolution: This happened because I was trying to insert data in bulk mode into an Oracle target that has an index defined on it. I changed the target load type to Normal, which resolved the issue. You can do this under Session -> Mapping -> target properties for that target table.


24) Issue with importing workflows or mappings into Informatica Cloud Services. The below message appears:

Select the PowerCenter Workflow XML that was exported from PowerCenter Repository Manager.
  • javax.xml.transform.TransformerException: Unable to evaluate expression using this context
Resolution: The above message appears because you did not export the objects from Repository Manager. The issue should be resolved if you export the objects from Repository Manager and then import them into Informatica Cloud Services as a PowerCenter task.

Data Warehouse and Business Intelligence Project Activities in Waterfall/Agile Methodology



Initiation phase

Activities:

  • Cost benefit analysis and ROI calculation
  • Project feasibility analysis
  • Business case study
  • Project cost estimation
  • Resource identification and allocation, other project management tasks
  • Vendor identification
  • Proof of concepts


Requirement Analysis

Activities:

  • Risk Assessment
  • Info Security Assessment
  • Data Dictionary
  • Capture Business requirements
  • Source Data Analysis
  • Defining source system format
  • Capture Non Functional requirements

Deliverables:

  • Business requirement document, Functional specification document
  • Data dictionary, etc.


Design

Activities:

  • High Level Architecture Design - Architecture Design artifacts, Project Technical review
  • Detailed design - Application Design (ETL/Report Design), Database Design
  • Data profiling, Data Quality analysis
  • Proof of concepts
  • Source Data Analysis
  • Source Target mapping
  • Logical/Physical Data Modeling
  • Design of workflows and schedules
  • Design walkthroughs, reviews and sign-offs

Deliverables: Design documents, data models, data mapping documents, etc.






Testing

Activities:

  • Prepare test strategy, test plan and test cases
  • Test document review and sign off
  • Test data set up
  • Test scripts creation
  • System integration/functional testing
  • Data validation
  • Performance testing
  • Defect logging and tracking
  • QA testing sign off
  • User acceptance testing and sign off

Deliverables: Testing documents, Defect summary, etc.


Implementation

Activities:

  • Prepare implementation plan or deployment guides
  • Raise change management/implementation tickets and get necessary approvals
  • Prepare runbooks
  • Support implementation group during implementation
  • Knowledge transfer
  • Provide post production support during warranty period

Deliverables: Production implemented code, run books, support manuals, change tickets, etc.





Sunday 20 July 2014

Business Intelligence - A simple definition and overview

Business Intelligence (BI): tools and systems that give a business the capability to store, analyze, and understand its data in order to make good business decisions, and to generate reports, dashboards, and metrics that provide a better understanding of company data, helping to improve efficiency and drive revenue.

Typical applications of business intelligence:
a) Metrics of gross revenue, profits, sales, etc. related to financial and sales analytics
b) Customer behavior profiling
c) Order management and supply chain
d) Geo-spatial analytics
e) Regulatory compliance
f) Human resource management



Sources of data for Business Intelligence:
 Operational data
 Web services
 Data marts/ODS and enterprise data warehouses
 Social media websites (Facebook/Twitter)
 B2B feeds from other businesses
 MDM (Master Data Management) data
 Geo-spatial information
 Pervasive computing devices such as sensors, cell phones, etc.


Business Intelligence tools:
 a) Reporting/Dashboard tools such as Business objects and microstrategy
 b) Data mining tools such as Weka
 c) Statistical analysis tools such as SAS/R
 d) Big data tools such as Hive/Impala etc for analysis of huge volume of data

There are obviously other, more complex tools that can be used for business intelligence; the ones above are what is normally found in industry.

Recent trends:
a) Moving business analytics to cloud such as Amazon redshift.
b) Using Big data for handling huge volume of data and offloading work from traditional data warehouse appliances.
c) Mobile BI applications
d) Social media analysis

Success Factors:
a) Management involvement and sponsorship
b) Focus on solving business problems
c) Stable and efficient BI environment


Required skill sets for building business intelligence systems:
a) Report/dashboard developers
b) Data modellers and data architects
c) ETL designers/developers with an understanding of data warehousing concepts
d) Statisticians/data scientists or data analysts
e) Testers and business analysts with a background in data warehousing and business intelligence


See also: http://dwbitechguru.blogspot.ca/2014/07/a-standard-etl-architecture.html



Saturday 19 July 2014

Consuming WebService In Informatica

To read data from SharePoint, or in general from any web service, use the following steps:
1)    Create a Web Service Consumer transformation using the Mapping Designer.
To create the Web Service Consumer transformation you need the WSDL (Web Services Description Language) file of the web service you are trying to connect to; in this case, it is the WSDL of the web service you are reading from SharePoint. Please ask the web service developer to provide you the WSDL file. In the example below, you are reading the GetListItems web service.
Using the WSDL, create the Web Service Consumer transformation, and configure the endpoint URL on the Web Service Consumer properties tab.



2) Add the Web Service Consumer transformation to the mapping. The response of the web service can then be parsed by an XML Parser transformation. To create the XML parser, use the XML from the GetListItems web service: ask the web service developer for a sample XML, or use the Web Service Consumer created above to write the XML to a flat file. An XML editor such as XMLSpy can then help create the .dtd files required to build the XML parser. Parse the XML in the Informatica mapping to get the desired output.
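Outside Informatica, the parsing step can be prototyped quickly to validate the structure of the response before building the .dtd. A sketch using Python's standard library; the XML below is a simplified, made-up stand-in for a GetListItems response (the real SharePoint schema uses namespaces and different element names):

```python
import xml.etree.ElementTree as ET

# Simplified, made-up stand-in for a GetListItems response payload.
response = """
<listitems>
    <row Title="Budget 2014" Author="jsmith"/>
    <row Title="Roadmap" Author="akumar"/>
</listitems>
"""

root = ET.fromstring(response)
items = [(r.get("Title"), r.get("Author")) for r in root.findall("row")]
# items == [("Budget 2014", "jsmith"), ("Roadmap", "akumar")]
```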

 

3) Finally, in the workflow, the connection for GetListItems should be configured properly. The GetListItems connection will be an application connection of type Web Services Consumer, and the endpoint URL should be configured for this connection.

 Check out how to create xml parser in Informatica using XML Spy:
http://dwbitechguru.blogspot.ca/2014/09/how-to-create-xml-parser-in-informatica.html