Monday, November 30, 2009

Security

It struck me today that security is a very important topic in system integration, yet it was largely missing from our topic presentations and discussions outside of single sign-on & SAML.

I did a couple of quick searches through our texts and found an interesting quote:

From "Enterprise Application Integration" (Wiley), Chapter 1:

"In the 1998 FBI/Computer Security Institute Computer Crime and Security Survey, 64 percent of respondents said their enterprises had been successfully attacked. Data modification occurred in 14 percent of the attacks. Quantifiable losses reached $136 million."

If things were bad in 1998, I would guess they're worse now (just a gut feeling, not backed up by data in any way!), so we must be aware of security issues if and when we are involved in the design, development or use of an integrated system. A few quick Google searches show that there are a LOT of resources (or at least articles) covering security related to SOA, SOAP, XML, web portals, etc.

I know that in my own experience with integration solutions (see my prior posts) I regularly handle rather sensitive personal information. Security in these cases is addressed largely by encrypting the data prior to sending it, effectively making any remaining security issues an "internal" matter.
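As a minimal sketch of that "encrypt before you send" approach (Python, using the cryptography package's Fernet recipe, with hypothetical file names; this is not the actual tooling we use):

```python
# Minimal sketch of encrypting a data file before it leaves our systems.
# Assumes the 'cryptography' package and a pre-shared key; file names are hypothetical.
from cryptography.fernet import Fernet

def encrypt_file(in_path: str, out_path: str, key: bytes) -> None:
    """Encrypt a data file so only the holder of the key can read it in transit."""
    cipher = Fernet(key)
    with open(in_path, "rb") as f:
        plaintext = f.read()
    with open(out_path, "wb") as f:
        f.write(cipher.encrypt(plaintext))

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice the key would be exchanged out of band
    encrypt_file("applicants.txt", "applicants.txt.enc", key)
```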

I'm curious: does anyone have particular examples of security done well (or poorly) in an integrated system?

Friday, November 27, 2009

Supply Chain E-Procurement Application: An example of B2B Integration

I work with our company's enterprise e-procurement solution, which I will refer to as E-Market (not its actual name). This is an excellent example of B2B integration because it involves not only the company I work for and the software vendor of the E-Market product, but also each of the product vendors that we purchase products from. I will attempt to explain how this product works in the real world and who the important players in the process are.

Below I have an architectural diagram of the systems and people that are involved:




End User Experience
I will start with the end user. They log onto the E-Market application via the web on their local machine. After logging into E-Market, the user is able to purchase items from the different product vendors' websites (i.e. Staples, Dell, Office Depot) through what are known as 'PunchOuts'. To the end user, the interface for selecting products from a vendor's website looks almost exactly the same as if the user had logged directly into the vendor's site without going through the E-Market application. The difference is that once the user has selected all of the items and hit submit, the order is not sent to the product vendor; instead, all of the information is transferred back into the E-Market application for final processing (i.e. selecting the payment method, shipping location, approvers of the order if necessary, etc.). After the user has finished entering all the necessary information into E-Market, the order is submitted and the user's work for the order is done. The order is then submitted to the product vendor. The end user may contact Application Support if they are having issues, or may speak with the Supply Chain department about potential product vendors that could be added as 'PunchOuts'.
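To make the PunchOut round trip a little more concrete, here is a simplified Python sketch of the setup request a procurement application might send to a vendor's PunchOut endpoint; the element names follow cXML conventions, but the URLs, identities and shared secret are hypothetical placeholders, not our actual configuration. When the user checks out on the vendor's site, the vendor posts the cart back (a PunchOutOrderMessage) to the return URL named in the request, which is how the order ends up back in E-Market for final processing.

```python
# Simplified sketch of starting a cXML PunchOut session (illustrative values only).
import urllib.request

PUNCHOUT_SETUP_REQUEST = """<?xml version="1.0"?>
<cXML payloadID="example-123@buyer.example.com" timestamp="2009-11-27T10:00:00">
  <Header>
    <From><Credential domain="NetworkId"><Identity>buyer.example.com</Identity></Credential></From>
    <To><Credential domain="NetworkId"><Identity>vendor.example.com</Identity></Credential></To>
    <Sender>
      <Credential domain="NetworkId">
        <Identity>buyer.example.com</Identity>
        <SharedSecret>not-a-real-secret</SharedSecret>
      </Credential>
    </Sender>
  </Header>
  <Request>
    <PunchOutSetupRequest operation="create">
      <BuyerCookie>session-42</BuyerCookie>
      <!-- Where the vendor posts the shopping cart (PunchOutOrderMessage) at checkout -->
      <BrowserFormPost><URL>https://emarket.example.com/punchout/return</URL></BrowserFormPost>
    </PunchOutSetupRequest>
  </Request>
</cXML>"""

def start_punchout(vendor_setup_url: str) -> bytes:
    """POST the setup request; the vendor replies with the StartPage URL the browser is sent to."""
    req = urllib.request.Request(
        vendor_setup_url,
        data=PUNCHOUT_SETUP_REQUEST.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()  # contains the PunchOutSetupResponse with the StartPage URL
```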

Application Support for E-Market Web Server/Database
This person is responsible for the upkeep of the E-Market web server, making sure that the system is always up and running, and providing support to the end user as well. Responsibilities also include applying any patches or updates, provided by the software vendor of the E-Market application, to the physical systems and applications. The physical systems can either reside on-site at the company that uses the E-Market software, or the software vendor of the E-Market application can host the company's web/database systems. The latter option introduces the possibility of cloud computing.

Software Vendor of E-Market Application
The software vendor is responsible for the complete software life-cycle of the E-Market product. They are the ones who provide the updates and patches for 'Application Support' to apply. If the system is hosted, then the software vendor will both provide and apply the patches. The software vendor of the E-Market application not only works with 'Application Support' to provide fixes; more importantly, they work with the product vendors whose websites are visited by the end users. This is truly where the B2B work is done. The software and product vendors must work together to establish standards that address the communication protocols between the systems (the E-Market system and the product vendors' systems). Within the procurement purchasing process, the following standards are used (a rough sketch of an order document built along these lines follows the list):

XSL – a standard for the documents that are sent between the systems (these contain purchase information such as the description, quantity, and price of the items ordered)
cXML – a standard protocol used for the communication of data between the systems.
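Purely as an illustration of what such an order document might look like, here is a hedged Python sketch that assembles a stripped-down cXML-style OrderRequest; the element names follow cXML conventions, but the values, part numbers and exact document shape are illustrative, not what our vendors actually exchange:

```python
# Hypothetical sketch: assembling a minimal cXML-style OrderRequest with the
# standard library. Real order documents carry many more elements.
import xml.etree.ElementTree as ET

def build_order_request(order_id: str, items: list[dict]) -> bytes:
    """items: e.g. [{"supplier_part": "STP-1001", "qty": 2, "price": "4.99", "desc": "Stapler"}]"""
    cxml = ET.Element("cXML", payloadID=f"{order_id}@buyer.example.com")
    request = ET.SubElement(cxml, "Request")
    order = ET.SubElement(request, "OrderRequest")
    header = ET.SubElement(order, "OrderRequestHeader", orderID=order_id, type="new")
    ET.SubElement(ET.SubElement(header, "Total"), "Money", currency="USD").text = \
        str(sum(float(i["price"]) * i["qty"] for i in items))
    for i in items:
        item_out = ET.SubElement(order, "ItemOut", quantity=str(i["qty"]))
        item_id = ET.SubElement(item_out, "ItemID")
        ET.SubElement(item_id, "SupplierPartID").text = i["supplier_part"]
        detail = ET.SubElement(item_out, "ItemDetail")
        ET.SubElement(ET.SubElement(detail, "UnitPrice"), "Money", currency="USD").text = i["price"]
        ET.SubElement(detail, "Description").text = i["desc"]
    return ET.tostring(cxml, encoding="utf-8")

print(build_order_request("PO-1001",
      [{"supplier_part": "STP-1001", "qty": 2, "price": "4.99", "desc": "Stapler"}]).decode())
```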

Product Vendors (i.e. Staples, Office Depot, Dell)
These vendors work with both the vendor of the E-Market software and the Supply Chain department representatives of the company that uses the E-Market software. The first relationship (product vendors <-> E-Market software vendor) was described above. The second relationship (product vendors <-> Supply Chain department) is primarily in place to agree on which product vendors will be available to the end users. This can be driven by requests from end users to add certain product vendors, or it can grow out of existing relationships between the Supply Chain department and the product vendors. Interestingly enough, this is another point of strong B2B integration: it gives both the product vendor and the company using the E-Market software insight into each other's processing patterns. That helps both sides understand how much money is being spent and how many transactions are occurring with which product vendors and which specific end users. This is important for both the company (to strike better deals on products) and the product vendor (to understand customer demand), and it supports a host of other improvements to the overall efficiency of supply chain management.

References:

cXML ORG (Excellent reference to discuss more on the standards/protocols used. This also defines some of the words used above.)
Wikipedia: E-procurement (General background on e-procurement).

Monday, November 23, 2009

A real-world example of integration that needs work.

I've been thinking about integration solutions from the user perspective, since it turns out I use a fair number of these on a day-to-day basis. In particular, I've found myself thinking a lot about the importance of finding a good solution, one that not only works properly but also functions well, especially from the user's perspective. I have the...um...pleasure (I use the term very loosely!) of using a rather sub-par integration solution on a daily basis at work.

We receive large amounts of data about applicants in digital format every day. The program (which I will simply call "Solution X") can be used to import data in a wide variety of formats, such as non-delimited text in a flat file, XML, or delimited text. All of the data inputs for which we use Solution X are non-delimited flat file formats; I think there are three we use regularly with this program. Solution X is rather extensible in that you can set up a different profile for each format. I haven't done this part myself, but I'm told you don't need any particular level of expertise and you don't even need to be a programmer (though I'm told it helps). The data ends up in an organization-wide Oracle database.

The program examines the data in two "passes", the first of which is almost completely automated: the user loads the program, loads the appropriate profile, loads the file and starts the first pass. The execution time for each record varies depending on the complexity of the file format and the resources of the computer (a desktop PC) running the program. If there are no errors, the total execution time for the first pass is simply the number of records in the file times the length of time per record. If there are errors, the user may have to dismiss a dialog box, which can stall the first pass. Once the pass is done, each record is in one of three states: matched, new, or unmatched. A matched record was determined to belong to an existing applicant record, and the input data has been added to that applicant's file. A new record was determined not to belong to any existing applicant record, and a new record was created for the applicant. An unmatched record couldn't automatically be determined to be either a match or new, so the user must make the final decision. The second pass through the file loads each unmatched record and gives the user the opportunity to decide whether the potential matches are in fact matches.
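To illustrate the general idea of the first pass (not Solution X's actual logic, which I haven't seen), here is a minimal Python sketch that classifies incoming records as matched, new, or unmatched using a made-up matching rule:

```python
# Hypothetical sketch of a "first pass": classify each incoming record as
# matched, new, or unmatched. The rule here (exact match on name and birth
# date, fall back to manual review on partial matches) is invented for
# illustration; Solution X's real rules are unknown to me.
from dataclasses import dataclass

@dataclass
class Record:
    last_name: str
    first_name: str
    birth_date: str  # e.g. "1990-05-17"

def classify(incoming: Record, existing: list[Record]) -> str:
    exact = [r for r in existing
             if (r.last_name, r.first_name, r.birth_date) ==
                (incoming.last_name, incoming.first_name, incoming.birth_date)]
    if exact:
        return "matched"    # load automatically into the existing applicant's file
    partial = [r for r in existing
               if r.last_name == incoming.last_name and r.birth_date == incoming.birth_date]
    if partial:
        return "unmatched"  # ambiguous: queue for the user's second pass
    return "new"            # create a new applicant record automatically

if __name__ == "__main__":
    existing = [Record("Smith", "Jane", "1990-05-17")]
    print(classify(Record("Smith", "Jane", "1990-05-17"), existing))   # matched
    print(classify(Record("Smith", "Janet", "1990-05-17"), existing))  # unmatched
    print(classify(Record("Jones", "Amy", "1988-01-02"), existing))    # new
```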

So far so good. Well, sort of.

First, we have some practical problems. One is that the program is not very forgiving when the input data changes. Even very small differences can require fairly significant work, either by IT staff to change the profile or by the user to manually alter the input data. Unfortunately, because of project priorities, we end up manually changing the input data on a daily basis.

Another problem is that when errors occur during a first pass they can easily halt the entire process. This is a major issue, particularly when very large files are received. For example, yesterday evening we received a file with about 1200 records that takes, on average, about 30 seconds per record on the first pass. That's 10 hours if there are no errors in the first pass. I will often start a first pass on a large file at the end of the day, and sometimes I come back in the morning to find that the program has been displaying an error dialog since record 23.

Also difficult is the fact that a record being flagged as unmatched doesn't necessarily mean it couldn't be matched or determined to be new. It could be that there was some problem with the input data that is enough to prevent loading but isn't enough to warrant informing the user. This leaves the user to look through cryptic log files and fix the problem manually.

As you can see, Solution X is functional. It does what it needs to do, albeit with some effort, and perhaps irritation, on the user's part.

The bigger problem, at least in my opinion, is that Solution X was obviously not designed for processing large numbers of files/records, or even with efficiency in mind! You cannot, for example, have two people working on the same input file at the same time, nor can you run multiple instances of the program at the same time (e.g. to work on two different types of files simultaneously). The only way to reduce that 10 hour initial pass time is to manually split the input file up and then run multiple instances of the program on different computers. This means that to get the first pass of that file done by the end of the day we need to dedicate at least two computers to the job.
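To give a flavor of that manual workaround, here is a small Python sketch that splits a fixed-length flat file into chunks so each piece can be run through a first pass on a different computer; the record length and file name are hypothetical:

```python
# Hypothetical sketch of the manual workaround: split a non-delimited flat file
# into chunks so several computers can each run a first pass on one piece.
# Assumes fixed-length records; the 600-byte record length is made up.
RECORD_LENGTH = 600

def split_flat_file(path: str, chunks: int) -> list[str]:
    with open(path, "rb") as f:
        data = f.read()
    records = [data[i:i + RECORD_LENGTH] for i in range(0, len(data), RECORD_LENGTH)]
    per_chunk = -(-len(records) // chunks)  # ceiling division
    out_paths = []
    for n in range(chunks):
        out_path = f"{path}.part{n + 1}"
        with open(out_path, "wb") as out:
            out.write(b"".join(records[n * per_chunk:(n + 1) * per_chunk]))
        out_paths.append(out_path)
    return out_paths

# e.g. split_flat_file("applicants_20091129.dat", 2) -> two files for two PCs
```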

The biggest problem is that everything is started by the user. The user must load the program, the profile, the file and then start the process. Because the program can get stuck rather easily the user has to check in to make sure that the "automatic" part is even running! For larger volumes like this it seems that we need a solution that runs as a daemon and takes care of the "easy" stuff itself...

The following is what I think Solution X should have been to begin with. Let's call it Solution Z. Solution Z is a two-part program utilizing a server-side daemon and a front-end GUI application. The server-side daemon continually monitors a drop folder. When new data is available it is automatically retrieved, decrypted and placed into the drop folder; sometime in the next minute the daemon automatically loads the file and initiates the matching/loading process. As with Solution X, records that are clearly matches or clearly new are loaded automatically. Records that are flagged as "unmatched" or that contain malformed data are placed in a queue.

The user can load the GUI app to pull unmatched or malformed records from the queue as time permits. For each record that is pulled, the user will make the decision about matching and/or correct any malformed data. Once these operations are performed, the records are resubmitted to the daemon for loading. The use of queues for the error corrections means that multiple users can work on a single input batch at the same time, and the use of a server-side daemon for loading allows for greater automation and parallelism in the processing of an input file. For high-priority records you could load the GUI app right away and pull from the queue frequently in order to complete the whole batch as soon as possible; for lower-priority records you could wait for the queue to build up.

If the server-side daemon is run as a job on a server (as opposed to running the "server" process on a desktop workstation) you would likely also see a significant decrease in processing time per record. Even if that doesn't change appreciably, you would almost certainly benefit from the fact that you don't have to wait for a human to start the "automatic" first pass.
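Here is a very rough Python sketch of the daemon half of Solution Z, with a simple polling loop and an in-process queue standing in for whatever real queueing mechanism would actually be used; the folder path, file pattern and matching rule are placeholders:

```python
# Rough sketch of the Solution Z daemon: watch a drop folder, auto-load clear
# matches/new records, and queue everything else for human review.
# The classify() rule and folder paths are placeholders.
import queue
import time
from pathlib import Path

DROP_FOLDER = Path("/data/dropfolder")            # hypothetical location
review_queue: "queue.Queue[str]" = queue.Queue()  # records needing a human decision

def classify(record: str) -> str:
    """Placeholder for the real matching logic: 'matched', 'new', or 'unmatched'."""
    return "unmatched" if "?" in record else "matched"

def load_record(record: str) -> None:
    """Placeholder for writing a matched/new record to the applicant database."""
    print("loaded:", record)

def process_file(path: Path) -> None:
    for line in path.read_text().splitlines():
        if classify(line) in ("matched", "new"):
            load_record(line)
        else:
            review_queue.put(line)  # GUI clients pull from this queue
    path.rename(path.with_name(path.name + ".done"))

def run_daemon(poll_seconds: int = 60) -> None:
    while True:
        for path in sorted(DROP_FOLDER.glob("*.txt")):
            process_file(path)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    run_daemon()
```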

Thoughts?

Saturday, October 31, 2009

Business Process Management Example: Selling cars at a dealer’s auction.

This is an example of business process management, along with a snapshot of a software tool that can be used to support it. One of the subsidiaries of the company that I work for is responsible for auctioning off cars, so for this example I will discuss the business process of getting a car from the lot to the auction block. The explanation will be very high level, providing the important details of the process without covering every intricate step necessary to sell the car. The steps I provide may not be the exact steps that the company takes (and the software tool is not the one used by that particular company), but the purpose is to illustrate how business process management works. I will use the term 'Company' to refer to the business responsible for auctioning off the car.

At the beginning of the process the car is received from an auto salvage shop location. Once the car is delivered to the Company, pictures are taken and information about the car is collected and placed into the Company's records. This information contains data such as the make, model, year, color, and any other important details about the car's history. After the information is recorded, the next step is to prepare the car to be sold at auction. This may require repainting, dent removal, or even significant body work (of course, the car should be sellable at a value higher than the cost of the work performed). All of this information is recorded and the car is then passed off to the necessary departments to have the work done. After all the work is done and the car is ready to be placed on the auction block, another set of pictures is taken and entered into the Company's records. These pictures are the ones made available to dealerships that may be interested in purchasing the car. The car is then shipped to the location of the auction, and on the day of the auction the car is sold (hopefully!). Once the car is sold, another set of paperwork is completed to record buyer and price information as well as any other pertinent information about the trade. This information also needs to be entered into the Company's records, and it is the last piece of information collected about the car.
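Purely as an illustration of how this process could be modeled as an ordered series of stages that a car record is routed through (my own sketch in Python, not anything produced by the Company's tools):

```python
# Illustrative sketch only: the auction business process as an ordered set of
# stages a car record moves through. Stage names paraphrase the description above.
STAGES = [
    "received_from_salvage",
    "intake_photos_and_data_recorded",
    "repair_work_assessed_and_performed",
    "sale_photos_recorded",
    "shipped_to_auction_site",
    "sold_at_auction",
    "buyer_and_price_recorded",
]

def advance(car: dict) -> dict:
    """Move a car record to the next stage, if any remain."""
    i = STAGES.index(car["stage"])
    if i + 1 < len(STAGES):
        car["stage"] = STAGES[i + 1]
    return car

car = {"vin": "EXAMPLEVIN12345", "stage": STAGES[0]}
while car["stage"] != STAGES[-1]:
    car = advance(car)
    print(car["stage"])
```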

The software that I used to illustrate the process is Stellent Imaging and Business Process Management (IBPM), which can be used to facilitate a process like this. The specific tool is called Process Builder, and you can read more detailed information by clicking here (starting on page 423). The software offers a lot of powerful functionality; for the sake of time and understanding I have only designed a simple diagram that illustrates the business process. I have listed some reference URLs after the picture for additional information.




References:

Oracle Imaging and Process Management (Stellent IBPM)

Oracle Business Process Management Suite

Oracle/Stellent Documentation

Tuesday, October 27, 2009

Google

Google really does offer a lot of cool stuff in the area of integration, like this one for people who use Google Voice (screenshots*, not the real thing!):



It's a "Call Widget" that lets people call me without knowing my phone number. Click it and you are asked to enter your name and phone number.



When you click the "Connect" button you get a phone call at the number you entered and then it will ring my phone number. I can also set up the widget so it goes directly to voice mail. There's also some other cool stuff, but I don't want to sound like an ad, so I'll stop now. *grin*

I don't know how the underlying program works, but I'd say that this "widget" would also likely meet the broader definition of a "portlet" as discussed in our presentation last week.

* I am not posting the real widget simply because I don't want every random web surfer in the world to be able to leave me messages or call me for no good reason.

Sunday, October 25, 2009

CIS 8020 Assignment 2 UT Olympic Guide Google Map

With the Olympics coming to Rio de Janeiro in 2016, MO Tourist Attractions (an imaginary company) wants to provide tourists the ability to obtain quick and accurate information on excellent places to dine, the best places to shop, and the locations of all the events. A graphical interface gives the tourist a much better opportunity to visualize these 'hot spots' than just reading a text-based article or brochure. By leveraging the simple API shown below, users will have the option of selecting whether they want to see only places to shop (represented by 'S'), places to dine (represented by 'D'), or locations of Olympic events (represented by 'E').

The Google Maps API is a great candidate solution to address this need. It can generate a static map that can be integrated into the MO Tourist Attractions website and allow users to select the different options (dining, shopping, events) that they want to see. A prototype is below:



When compared to other solutions, this option would be quick to implement and doesn't require a lot of work on the back end of the website. It can be published and modified as necessary by any one of the agents at the tourist office and does not require advanced knowledge of website design. The map loads fairly quickly, and the solution will be very low-maintenance.
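As a rough sketch of how the static map request could be assembled, here is a short Python example using the Google Static Maps URL parameters as I understand them; the coordinates are placeholders rather than real Rio de Janeiro locations, and a real deployment would need to confirm the current parameter set (and any API key requirements) against Google's documentation:

```python
# Hypothetical sketch: build a Google Static Maps URL with labeled markers for
# Shopping (S), Dining (D) and Events (E). Coordinates are placeholders.
from urllib.parse import urlencode

BASE = "https://maps.googleapis.com/maps/api/staticmap"

PLACES = {
    "S": [(-22.9700, -43.1800), (-22.9650, -43.1750)],  # shopping
    "D": [(-22.9710, -43.1860)],                         # dining
    "E": [(-22.9121, -43.2302)],                         # Olympic events
}

def static_map_url(selected: set[str], size: str = "600x400") -> str:
    params = [("center", "Rio de Janeiro"), ("zoom", "12"), ("size", size)]
    for label in selected:
        for lat, lng in PLACES.get(label, []):
            params.append(("markers", f"label:{label}|{lat},{lng}"))
    return BASE + "?" + urlencode(params)

print(static_map_url({"S", "E"}))  # user chose shopping and events only
```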

Additionally, MO Tourist Attractions can look at expanding on the Google Maps API by integrating the Google Calendar API. This would allow the mapping of times of specific Olympic events to their locations, as well as any special occasions that may be happening at the dining restaurants (i.e. Olympic athletes dining at certain restaurants).

Friday, October 23, 2009

CIS 8020 Assignment 2 PW Visualization API - Charts

A college admissions/recruitment office typically receives applications for three different terms and from three different types of students at any given point in time. The department management has requested a quick way to visually demonstrate which combinations of student types and terms are being received from week to week, for use in a newsletter that is posted on the organization's internal website.

The Google Visualization API is an excellent tool for implementing this solution since it can automatically create interactive charts to display the data. In this example I created a simple spreadsheet in Google Docs that contains the data for the week. Two charts have been created in order to show the data both by student type and semester:







This style of implementation was chosen because the code to create the charts only has to be written once. The person who writes the newsletter never needs to touch the code; they simply update the data in the Google Docs spreadsheet and the charts are instantly updated.
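For illustration, here is a small Python sketch of the kind of weekly tally that produces the numbers entered into the spreadsheet (the charts themselves are drawn by the Visualization API on the page, which I'm not reproducing here); the student types, terms and counts are made-up examples:

```python
# Illustrative sketch: tally a week's applications by student type and by term,
# producing the two summaries the spreadsheet-backed charts display.
# The sample data and category names are made up for the example.
from collections import Counter

applications = [
    {"type": "Freshman", "term": "Spring"},
    {"type": "Transfer", "term": "Fall"},
    {"type": "Freshman", "term": "Fall"},
    {"type": "Graduate", "term": "Summer"},
    {"type": "Transfer", "term": "Fall"},
]

by_type = Counter(app["type"] for app in applications)
by_term = Counter(app["term"] for app in applications)

print("By student type:", dict(by_type))  # e.g. {'Freshman': 2, 'Transfer': 2, 'Graduate': 1}
print("By term:", dict(by_term))          # e.g. {'Fall': 3, 'Spring': 1, 'Summer': 1}
```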

A less elegant and more labor-intensive solution would be to create the chart in a desktop spreadsheet program (e.g. MS Excel or OpenOffice.org) and then manually generate an image to be uploaded to the web server and included on the webpage.

NOTE: There is a compatibility issue with this code! The charts display without error under Firefox and Google Chrome, however they do not appear under Internet Explorer or Safari (Windows).
As such, I recommend that you try viewing this post with Firefox. Actually, I recommend that you try viewing everything with Firefox! ;-)

Sunday, October 18, 2009

Future of Portals?

What I found interesting in my research of portals and portlets is where this technology will be in the future. The basis for this curiosity came from the article here: http://blogs.zdnet.com/BTL/?p=4912: “The future of portals is mashups, SOA, more aggregation”.

When I first started reading this, the question that came to my mind was: “Will mashups replace portlets within a portal?” We discussed mashups a couple of weeks ago, and one of the questions posed at that time was whether mashups will replace ERP (http://sysintteam.blogspot.com/2009/10/can-mashups-replace-erp-enterprise.html). After reading more about the future of portals, I saw that what the blog was really proposing is that mashups will be used to aggregate multiple portals together. Imagine being able to use a Google portlet within your own company's enterprise portal. I actually found an example here (http://blogs.sun.com/javacapsfieldtech/entry/healthcare_facility_mashup_portlet_with) where a company is looking to use what is called a 'mashup portlet' to leverage Google Maps to show facility locations. Google provides its APIs at an enterprise support level (of course, with the purchase of a license), so that companies can integrate solutions that are already proven to be some of the most popular tools (http://www.google.com/options) for personal and professional activities. In addition to the many advantages this offers companies, there are also some concerns. Personally, I think one of the biggest concerns is security. Because the different tools provided by Google are so popular, there is a very large audience that has an 'interest' in them. With the API being available to virtually anyone, this can pose a security threat to any company leveraging Google's tools. However, to exploit the Google API an attacker would first have to get past the security of the company's enterprise portal. Nonetheless, security should still be an important consideration when integrating third-party applications within a portal.

Saturday, October 17, 2009

Portal & Portlets

On Tuesday we will be discussing portals and portlets, with UI integration as the focal point. With the initial research completed, I couldn't help but notice that portal is definitely a buzzword with more uses than a Swiss Army knife. If you doubt me, try typing 'portal' into Google and see if you can find a word that returns more hits. But I believe that despite all the hoopla around the word portal, at its core (in reference to technology) is integration. Portals allow for the integration of different applications to provide the end user one central point of access. The applications that are integrated within the portal can also be referred to as portlets. Web Services for Remote Portlets (WSRP), a standard used for portlets, will be discussed as well.

There are instances when a portal contains the full-fledged application integrated within it, and instances when the portal only serves as a gateway to the application. These two cases are referred to, respectively, as integration at the application layer (full integration) and integration at the presentation layer (a shallower integration). The implications of each will be discussed in class.

Some links for general reading on portals and portlets are below.

Web Portal
Portlets
Web Services for Remote Portlets (WSRP)
MSDN: Portal Integration
Portals and Web services: Services at the user interface

Monday, September 21, 2009

XML Standards and Vocabularies (Applications)



(Due to the galactic nature of the word XML, we are zeroing our discussion in on the topics below):

For our discussion in tomorrow's class we will focus on XML Standards and Applications. Specifically, we will look at Electronic Data Interchange XML (EDI XML) and Electronic Business XML (ebXML). We will talk about the utilization of the two applications as well as their architecture. We will also speak about their benefits and give some examples. Finally, we will compare and contrast the two technologies and finish the discussion with some thoughts on the future of both EDI and ebXML.



Some good reference material for the discussion is listed below:

ebXML.org

EDI/XML (wiki)

Index of XML Standards

Thursday, September 17, 2009

XML...to infinity and beyond!


For our discussion next week we will cover XML-based standards and applications. We will briefly talk about various XML vocabularies in reference to different industries and to the capabilities of web services. There will also be discussion of applications which use XML-based standards. Below are some more resources that I encourage everyone to look at.



Jean Paoli (one of the co-editors of XML) comments on where XML is today, and moving forward:





Source: YouTube



Some XML sites that contain background information:



Extensible Markup Language (XML)

XML Standards Reference

XML standards and vocabularies

...and I shall follow...

I am Umoja Thomas...aka MO BLOGGER. I am in my third semester of the Flex MBA program with a concentration in Information Systems. Although born in NY, I was raised here in Atlanta. My undergrad degree is in Computer Science from N.C. A&T State University (AGGIE PRIDE!). I have worked in IT for the past 8 years and currently work in Application Support/Engineering.
I enjoy listening to music and gaming (when I can). I also enjoy a good game of chess.

Tuesday, September 15, 2009

A brief introduction to your humble contributors.

Greetings! My name is Peter Weber. I just started my first semester in the MS in Information Systems program. My wife and I moved to Atlanta from the Chicagoland area just over 3 years ago and I've been on staff here at GSU for the last 2 years. My undergrad work was in Psychology, Sociology & Computer Science. I worked in mental health for about 5 years and did some graduate work in Clinical Psychology in Chicago, but eventually decided to change careers. I'm currently working as the supervisor of Initial Processing/Operations in the Office of Undergraduate Admissions.

I spend most of my non-working and non-school hours being the proud daddy of my 18-month-old daughter. My favorite hobby is playing my saxophone, though I don't get to play much between work, school & parenting! My musical tastes are rather eclectic, but by far my favorite genre is jazz.

---