E-commerce and Linux
paRcat asks: "My company is using a proprietary system for letting our customers order online. It takes the order and, as soon as they click submit, sends it on to our main system via a serial connection. Both systems are running on NT, and they die constantly. I've been pushing to get Linux in here, and I think replacing the online ordering server is the best way to start. Our catalog has around 25,000 items in it. It's held in an Access database right now... around 14 megs. I suppose it could be converted, but every time a pricing update comes out, it's distributed in MDB format. What tools exist for Linux that can do what I need? It just needs to allow access to the database, take the order, and send it down the serial line. I was contemplating setting up mod_perl and just writing a bunch of code, but I'm still a bit new to Perl, and I'm not even sure that's the best choice." Apache, Perl/mod_perl and MySQL/mSQL/PostgreSQL all sound perfect for this application, but getting the Access information across (both the existing data and the future updates) might be a problem. Any ideas?
MySQL (Score:1)
Converting between Access... (Score:1)
Re:MySQL (Score:1)
Export the data from Access as CSV or tab-delimited and then import it with mysqlimport.
I think the hardest part of all this would be programming the serial interface.
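A sketch of that route, with invented names throughout (database "shop", export file catalog.txt): mysqlimport loads each file into the table matching its basename, so a tiny wrapper is all you need.

    #!/usr/bin/perl -w
    # Load a tab-delimited Access export into MySQL. mysqlimport maps
    # catalog.txt onto the catalog table (table name = file basename).
    use strict;

    my $db   = 'shop';               # hypothetical database name
    my $file = '/tmp/catalog.txt';   # tab-delimited export from Access

    system('mysqlimport', '--local', '--user=webuser', '--password=secret',
           $db, $file) == 0
        or die "mysqlimport failed: $?\n";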
Access to MySQL (Score:1)
First: Check out http://www.mysql.org. It answers a fair number of the questions you might have already.
Second: As far as I know, there is middleware to convert Access databases to MySQL data. This should take care of the problem of having two DBs. The only catch is that it's a total DB dump, so it could become more and more unwieldy as the DB grows.
Third: You can also connect the Access DB to MySQL using MyODBC if needed: http://www.mysql.org/Manual_chapter/manual_ODBC.h
Hope this helps...
/Lindus
Access will do this via ODBC (Score:2)
Then just copy and paste.
You can automate this in about 5 lines of Visual Basic.
On the downside, you do need to do this on 95 or NT.
Access Databases (Score:1)
Lando
Interesting problem (Score:1)
Re:MySQL (Score:1)
Re:hahaha (Score:1)
This should not be a problem (Score:1)
1. You can export the Access table to a CSV (comma-separated values) file and then import the file into the database, either with the database's supplied tools or with a script (see the sketch after this list).
2. PostgreSQL, and probably all the other open source databases, have ODBC drivers available, which allows MS Access to link directly to the database server.
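For option 1, the import side can be a few lines of Perl with DBI; everything here (database name, items table, columns) is invented for illustration, and Text::CSV is the sturdier parser if your fields can contain commas.

    #!/usr/bin/perl -w
    # Walk a CSV export from Access and insert the rows into PostgreSQL.
    use strict;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=catalog', 'webuser', 'secret',
                           { RaiseError => 1, AutoCommit => 0 });
    my $sth = $dbh->prepare(
        'INSERT INTO items (sku, description, price) VALUES (?, ?, ?)');

    open my $fh, '<', 'items.csv' or die "items.csv: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my @fields = split /,/, $line;   # naive split; no embedded commas
        $sth->execute(@fields[0 .. 2]);
    }
    close $fh;
    $dbh->commit;                        # one transaction for the whole load
    $dbh->disconnect;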
Re:Coverting between access... (Score:2)
roll your own solution (Score:1)
Perl or PHP3 would probably be good choices for the language, as both are nice high-level tools that interact with databases easily.
I strongly recommend not attempting to use MiniVend, though, as it's a major PITA to get working, and the documentation is old, brief, and in some cases wrong.
Converting Access to MySQL (Score:3)
Access allows you to get away with somewhat sloppy data modeling, so you'll need to revisit the generated SQL and make sure all the "NOT NULL" constraints and data types are in place as you need them (this can all be tweaked in the VBA without too much effort, but you should be knowledgeable in VBA and SQL). You no doubt know that the MySQL data types are more specific than the data types used by Jet (the actual DB engine Access uses). This means you'll want to make sure that the column that was exported as "REAL" shouldn't really be "MEDIUMINT" instead.
The script generates SQL files that can then be run against MySQL in batch mode. So with it, you can set up the VBA function to create the appropriate SQL for refreshing your tables whenever you receive a new MDB.
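Running a generated file against MySQL in batch mode is then a one-liner (user, database, and file names invented for the example):

    mysql -u webuser -p catalog < refresh_tables.sql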
My suggestion is to dump Access. It's nice for rapid development, but if you're running an e-commerce operation on it, then I pity you. Not only would NT be your problem, but you'd also have Access's sloppy page-level record locking and its tendency not to release connections and record locks with the consistency e-commerce apps have a right to expect.
PHP (Score:1)
With PHP4/Zend on the horizon, PHP is looking like an increasingly robust and viable alternative to Perl.
----- --- - - -
jacob rothstein
Suggestions (Score:3)
- It's there. He who gets there first has the home-field advantage
- It handles the Access data import needs without any problems (or at least you didn't mention any)
- The usual PHB tendency to swallow MS FUD(tm) will probably work against you.
That having been said, here's a start on countering it and working up a case:
- Definitely look into ODBC or some sort of easy export from Access. I'm not familiar enough with the MS world to know a sure solution, but I imagine the worst case is some pretty simple VBA scripting in Access to open the file and dump out selected records (or all records) in a format MySQL can import.
- I agree that MySQL is probably a good database for this. Don't rule out other options, however; you don't want to find MySQL doesn't fit your needs and then have to propose *another* change to management. Postgres has worked well for us in some applications and is a little more full-featured than MySQL (although not as fast; feature-wise it's actually a pretty close race). Oracle on Linux is even a possible choice, but you haven't mentioned budget or database size. Since it's coming from Access, I'll assume it's a small database -- in that case, MySQL looks pretty good.
- Web server: Apache or Stronghold if you need SSL. We just started using Stronghold, and so far it seems dead easy (much better than Netscape Enterprise Server which was the only SSL solution we had tried before this). It's not expensive as far as SSL solutions go, and it seems to track Apache releases pretty well.
- Application coding:
Perl is great, but you say you're not experienced yet. That's not a show-stopper at all, but consider carefully how to proceed. If you're willing to pick up the Camel book, the Ram book, and a printout of the CGI.pm and mod_perl documentation, you may be ready to be a Perl web programmer.
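To give a feel for the territory, here is roughly the smallest useful CGI.pm + DBI page; the credentials and the items table are invented for illustration.

    #!/usr/bin/perl -w
    # Minimal CGI.pm + DBI page: list catalog items matching a search term.
    use strict;
    use CGI qw(:standard);
    use DBI;

    my $q    = CGI->new;
    my $term = $q->param('q') || '';

    print $q->header('text/html'),
          $q->start_html('Catalog search'),
          $q->h1('Results');

    my $dbh = DBI->connect('dbi:mysql:catalog', 'webuser', 'secret',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare(
        'SELECT sku, description FROM items WHERE description LIKE ?');
    $sth->execute("%$term%");           # DBI handles the quoting

    while (my ($sku, $desc) = $sth->fetchrow_array) {
        print $q->p("$sku: $desc");
    }
    print $q->end_html;

Under mod_perl (Apache::Registry), the same script runs unchanged but stays compiled in memory.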
As far as non-Perl solutions go, Python is great, and Zope seems to be getting more and more attention. Check out Zope and scour Freshmeat for other web application architectures; you may find they solve some of your problems well. Java servlets are an excellent choice too, but expect them to require much more programming savvy.
Anyway, the first step really is to analyze the current system and figure out all the components. For each one, pick a few possible substitutes. Play around and convince yourself that the pieces you select play nicely together. Then show *that* to the PHB and get the go-ahead; they tend to be pretty easy to convince if you know your stuff...
What exactly is causing the crash (Score:2)
--MD--
Access to ODBC to PostgreSQL (Score:1)
For another client I eventually made a web-based editing interface and eliminated their Access database altogether. Works even better :)
Ah, so you are volunteering, then... (Score:2)
Show us why you are better qualified than anyone for the job.
--
Comments on web-applications (Score:1)
Also, if you're going to be learning Perl for this application, might I suggest using Python instead? It can do anything Perl can, but it's a considerably cleaner language (read: more readable), and there's an Apache module for it as well.
Perl's fine for programs around 50-100 lines or so, but larger than that and it starts to become really messy.
Re:Access to MySQL (Score:2)
Here is a total solution (Score:1)
http://yams.screamdesign.com/
Export in MySQL format from Access (Score:1)
Either use MyODBC, as someone pointed out, or use a script available here [informate.com] to dump an Access DB into nicely formatted MySQL-compatible SQL.
There are a couple of these utilities available on the MySQL web site [mysql.com].
MS-Access / Visual Basic (Score:1)
FYI: The choice I have made for my company for end-to-end web-based ordering is Linux, Apache, mod_ssl, PHP, MySQL, and CyberCash. PHP allows you to link right against the CyberCash MCK and even comes with some nice pre-written interface functions.
Good luck. Moving away from Access (even toward MS SQL Server or otherwise) will help you A TON. Win NT ODBC drivers are really, really poor.
---
Don Rude - AKA - RudeDude
Been there, done that (Score:5)
The Perl was a piece of cake to move over -- we even switched to mod_perl along the way. The database, however, was a bit of a pain. We moved it first from Access to SQL 7, then used SQL 7's data export function to stuff it into Informix on the Linux box. It was a nightmare; there are so many things that just don't move across: views, identifiers longer than 18 characters, etc.
My suggestions, from hard-won experience are:
This is a good lesson in why to create portable applications. Just move in pieces and you'll gradually see your system get more and more stable, without getting above your head in new things to learn.
Good luck.
Re:Comments on web-applications (Score:1)
I disagree with your assessment of Perl; I'd say it's fine for programs of about 500 lines, after which you are probably done. Hooray for mod_perl!
Who's to blame here? (Score:1)
I don't know what technologies you're using other than Access, which is seldom a good solution for a website. But NT works well for us. As with any public server, you'll want to pare it down to run exactly what you need, no more.
I do root for Linux, though, so if you go that way, best of luck!
Re:Suggestion: (Score:1)
"Linux sucks, Windows rules!!! Especially the version coming out in 3 months that's been delayed for over 2 years! It will be the greatest OS ever!"
Linux, BSD, and NT are all scalable and reliable OSes as long as you have the right equipment and a competent sysadmin, but none of them is the best solution for everything. Hell, even Microsoft uses UNIX over NT when it's better for the job (like in Hotmail).
use servlets (Score:1)
advantages over perl:
henri
perl interfaces: (Score:1)
DBI with DBD::ODBC should allow you to connect to the ODBC driver on the NT box.
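A sketch of the connection; 'AccessCatalog' stands in for whatever System DSN you define in the NT ODBC control panel:

    use strict;
    use DBI;

    # 'AccessCatalog' is a hypothetical System DSN pointing at the MDB.
    my $dbh = DBI->connect('dbi:ODBC:AccessCatalog', '', '',
                           { RaiseError => 1 });

    my $rows = $dbh->selectall_arrayref('SELECT sku, price FROM items');
    print "$_->[0]\t$_->[1]\n" for @$rows;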
Access..y stuff (Score:1)
Access exports into some spreadsheet formats, text, rich text format, FoxPro, and dBase, and even outputs to ODBC databases (though you'd have to see if you can get a driver Access accepts from the database's manufacturer).
dBase is probably your best bet as far as real database formats are concerned.
As for ODBC drivers, it would be interesting to see whether MS Access would accept PostgreSQL drivers, or whether it would revolt. Anyway, the Access help file has some info on installing ODBC drivers.
If nothing else, most DBMSes should import a text-format table.
Good luck.
Hail the Penguin!
A suggested migration path... (Score:3)
You're on the right path... Perl can certainly do what you need. In fact, most of the difficult code you would need is already written: DBI & DBD::ODBC. What you'd want to do is establish an ODBC connection to your Access database and do all your data manipulation that way.
Using this, you can move the dynamic component of your site to Perl-based CGIs running on your existing NT servers. (Take care to avoid anything platform-specific, and avoid ASP/PerlScript; that will only further anchor you to NT.) Not a radical shift, and one that should both help your current situation and increase everyone's confidence in what you're doing.
Next, on the side, implement your Linux box running MySQL. Write a Perl script using DBI, DBD::ODBC, and DBD::mysql to periodically refresh the MySQL database from the Access tables. This will be the trickiest part. I recommend keeping the MySQL data model the same as the Access data model to keep things simple. Unfortunately, there isn't an easy or inexpensive way to read Access files on a Linux box, but that's not a big obstacle. You can use DBD::Proxy. Alternatively, have two Perl scripts: one on the NT box to export data to a tab-delimited flat file, and another on the Linux side to import the file (a sketch of the export half follows). With some smart scripting and the use of Net::FTP & Net::Telnet, you could integrate this into one process. (Simpler, but less cool: just do some intelligent automated scheduling on both sides via NT's scheduler and Linux's cron.)
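A sketch of the NT-side export half (DSN, table, and output path are all invented; the Linux half can be mysqlimport or LOAD DATA INFILE, as shown elsewhere in this discussion):

    #!/usr/bin/perl -w
    # NT side: dump the Access catalog to a tab-delimited file that the
    # Linux box can fetch (Net::FTP) and feed to LOAD DATA INFILE.
    use strict;
    use DBI;

    my $dbh = DBI->connect('dbi:ODBC:AccessCatalog', '', '',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare('SELECT sku, description, price FROM items');
    $sth->execute;

    open my $out, '>', 'C:/export/items.txt' or die "items.txt: $!";
    while (my @row = $sth->fetchrow_array) {
        for (@row) {
            $_ = '' unless defined $_;   # NULL becomes an empty field
            s/[\t\n]/ /g;                # keep one record per line
        }
        print $out join("\t", @row), "\n";
    }
    close $out;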
Now, throw mod_perl into the mix. Present each solution side by side to your superiors. Explain the performance difference, the scalability, the cost difference, the seamless integration, and the added functionality of the Linux/Apache/Perl/mod_perl/MySQL solution, and *presto* -- you have your solution (and you look like a superstar).
-Bill
Re:Comments on web-applications (Score:2)
There's a GPL MySQL out there, perfect for you then.
Yours Truly,
Dan Kaminsky
DoxPara Research
http://www.doxpara.com
Access to run a Business App?? (Score:1)
Also, learn SQL; you'll soon see that importing data from any ODBC-compliant DB is simple, logical, or both. If you don't want to, then install MS SQL on your NT boxes and run the Access upgrade wizard to import the data into new tables.
And hire an NT administrator. I work on both Linux and NT servers, and I can make a Linux server as unstable as most NT servers. Remember, most people who call NT unstable do not fully know how to configure and maintain an NT server. To make matters worse, there are also many NT administrators who don't know how to properly manage an NT server. It sounds like yours fall into the latter category.
Use an SQL server & ODBC (Score:1)
load data local infile '/home/me/table1.txt' into table table1
    fields terminated by ',' optionally enclosed by "'" escaped by '\\'
    ignore 1 lines
    (col1, col2, col3, col4);
But check the docs to see if you need anything else. sed and awk are good if you need/want to sanitize the table a bit first.
Then use an ODBC driver on the client machines to allow updates. MyODBC works well with Access 97 but I've heard of some problems with Access 2000 (I think someone at Microsoft is working on a patch for the next service pack?). Make sure all of the tables you want to update have a primary key defined.
Read your SQL server's docs for access control. With MySQL, I added a username, password, and IP address triplet to the mysql.user table with 'N' for all of the global perms, then added a line to the mysql.db table allowing that user select, insert, update, and delete privileges on the given database from their IP address (remember to run mysqladmin reload after editing the grant tables directly). I'm sure there are other, better ways of doing it, though.
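For the record, the GRANT statement does the same thing in one step and reloads the grant tables for you (user, host, password, and database invented):

    grant select, insert, update, delete on catalog.*
        to webuser@'192.168.1.10' identified by 'secret';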
I prefer PHP (http://www.php.net/) for the WWW frontend, but that's just me.
Re:Your problem is Access.. not NT. (Score:1)
SQL 6.5 is much more stable. It has limitations, such as no more than 16 tables in a view/query, but it is preferable to 7 at this service pack.
The suggestion of MySQL on Linux, with Access talking to it over ODBC, is not bad, as long as Access doesn't crash and/or hang, as it's famous for doing. And if you use ODBC with MySQL, you can use lots of front ends/middlemen: Perl on Linux, ASP on NT, etc.
Setting up a MySQL server on Linux shouldn't be any more difficult than anything else, as this poster suggests. It may save you money up front, and you will learn something in the end. The cost of Linux is the cost of learning -- a price always worth paying.
PHP not PERL (Score:1)
For the database server, I'd recommend PostgreSQL over MySQL (transactions are nice), or you could just keep the database on the NT box and use ODBC (although I'd recommend upgrading to a database server like Oracle or MS SQL 7; MS Access is NOT meant for what you are using it for).
Fixing the problem (Score:1)
Make full use of that time to do the job right. I strongly suggest that you not attempt the port yourself in a language you are still learning. Everybody writes really horrible code while learning any language, and you do not want that code enshrined forever in an application that's critical to your business. Hire someone who knows how to build this type of app, and perform a real analysis and design before doing the coding. You will be building the second system, so you have a pretty good idea what the requirements are. If you follow my advice and bring in someone who knows how to write this kind of application, and you understand the problem, the technology, and the data, there is no better model to follow than top-down: analyze, then design, then code. Of course, even when you understand all the elements of your application, the creation of the application itself changes the environment it lives in, so some adjustments to the design, and perhaps even the analysis, will be needed. But those adjustments will always be more expensive if made after coding has started than if made before.
You have a working system. Fix it, then replace it with a well-engineered second system, and your management will be pleased. Replace it with another hack, and nobody will be happy.
Yeah, that's a great idea. (Score:1)
Ryan Minihan
ryan@linuxhardware.com [mailto]
Re:Comments on web-applications (Score:1)
If your Perl programs are unreadable after 100 lines, you're doing something very wrong.
MySQL (Score:1)
Also, as their site says: "Our connection to the world has been down from Saturday 19991030 GMT 22.32 to Monday 19991101 GMT 17.10."
---
Accessing Multiple Databases on Multiple Servers (Score:2)
I'm currently in the throes of the same type of project as you... The only addition is that I need to include an accounting system written in Visual FoxPro. My requirements are to access a MySQL system that's our main database, with some information handled via MS Access, and to query the FoxPro-based system as well. To make it even more challenging, the FoxPro-based system HAS to be on NT Server per our support contract with the vendor! Nasty, but there's a way to make it all play nicely!
I decided to use Perl instead of PHP mainly because of Perl's flexibility: it runs on NT and Linux with few changes, has excellent performance, etc. I'm not trying to start a holy war here; use what you're comfortable with. The biggest hurdle was accessing the FoxPro system on NT. For that I used OpenLink Multi-Tier 3.2 [openlinksw.com] on the NT box. All you'll need to do there is configure the ODBC driver for Access. If you can read the *.mdb file with Excel, then the ODBC driver is configured properly. Install OpenLink on the NT box -- read the docs that come with OpenLink; it's straightforward. You'll also need to grab the UDBC & ODBC stuff from OpenLink as well; that is used to compile the DBD::ODBC Perl module. Again, RTFM -- it's all in there. I did run into one glitch with an unrecognized command when compiling DBD::ODBC with UDBC. I only tested the module against FoxPro tables, and it did generate some errors; they were all due to SQL commands longer than 80 chars -- not a problem in my environment, though.
Once that's up and running, and you've decided to use Perl, head over to the DBI/DBD FAQ [hypermart.net]. Sections 3 & 4 cover what you need to do. It's really not as hard as you'd think to get Perl talking over ODBC, even to a separate NT server. Connecting to any other ODBC-compatible database works the same way.
From what I've read, PHP is just as easy. I first started developing our databases using PHP but switched because I have other projects where Perl can be used and PHP can't. Perl looks slightly more complex than PHP for straight database access, but so far it hasn't been that bad. Some of the PHP code I have looks really close to Perl, so the switch isn't as painful as I thought.
I haven't tried this, but if you dump NT you could use iODBC to access the Access DB. All the parts are there; I've just never tried it. That way, you could still distribute your database in MDB format. Another option would be to keep a skeleton MDB around and run everything on MySQL; when needed, you could just dump the MySQL tables out to the MDB tables and send it on its way. The second thing I did was set up a test network and just start working on it. I've been able to connect all the different databases together after about 12 hours of work... just to give you an idea of the time and effort involved.
.mark
HTH
Re:Comments on web-applications (Score:1)
Re:use servlets (Score:1)
Java is not the answer for everything. It does some things well and has loads of functionality built in, but it doesn't fit well with CGI-style work.
MSDE (Score:2)
(Of course, this being Microsoft, there's already been one service pack...)
Access Online == Bad Idea (Score:2)
If there are a lot of Access-based tools that are being used to "massage" the data internally, that is arguably not wonderful; what is crucial is that this not be used for the online copy.
You probably should consider using something like MySQL for handling the online data access for the web server; it would be entirely appropriate to build a process that synchronizes the online data with what's in your "back office" systems. This synchronization can add substantially to the overall robustness of your systems, as this can allow you to detect both:
It might be useful to build some abstractions behind the scenes, like message queues (e.g., IBM's MQSeries [ibm.com], on which Microsoft's MSMQ is based); a free, commercially used tool that does this sort of thing is Isect [netcom.com].
No MySQL for E-Commerce! (Score:1)
I personally like Sybase (11.0.3.3 is free to use in deployment on Linux), but Oracle or the other big names have also proven themselves in years of use..
Screw Access (Score:1)
Quite simple, really. (Score:1)
Re:Suggestion: (Score:1)
Re:MySQL is not appropriate for any serious purpos (Score:1)
--
Marques Johansson
displague@linuxfan.com
I agree; this lets you run in parallel, low risk (Score:1)
Another selling point for your boss is that not only are you replicating and improving on existing functionality under NT, but once you're on a Linux/Apache platform, a lot of zero-cost *additional* functionality can be added as needed. You'll also be much better poised to start taking advantage of XML parsers, etc., when that need comes along.
Re:No MySQL for E-Commerce! (Score:1)
I use PostgreSQL for my personal web work, but I think there's a lot to be said for using a tried-and-proven commercial database server when other folks' money is involved, as is the case with e-commerce. As mentioned in this thread, Sybase has a no-cost version, yesterday's edition so to speak, available for Linux, and it's good.
Re:MySQL is not appropriate for any serious purpos (Score:1)
Every time I hear someone say that you need transaction rollback for a web environment, I cringe...
Flamebait (Score:1)
Re:What exactly is causing the crash (Score:2)
You certainly can run an e-commerce server on NT. My company sells an NT payments system, which is essentially the brick-and-mortar version of an e-commerce system. I'm not about to get into an argument over whether NT is better for this because, quite frankly, I have my own doubts about that. (More than doubts, actually.) But that does not mean NT isn't workable. We've got numerous installations that stay up for months, and given the load they get, I see no reason why an NT e-commerce system wouldn't be similar.
One thing people need to remember is that "not as good" is not the same as "unworkable".
Re:use servlets (Score:1)
mySQL and Access (Score:2)
I definitely want to emphasize what many others have said: DO NOT use Access for a multi-user application. It will work just well enough to fool you into committing resources, and then it will fall on its face. You are much better off getting the data into MySQL as soon as humanly possible, and then going from there.
D
----
Most likely ASP (Score:1)
A trappable error occurred in an external object?
This usually means the "World Wide Web Publishing Service" has fallen over (like IIS does only too well).
I've converted over to Perl under NT, and it's a dream compared to ASPs or compiled CGI programs!
Though, if I had my way, we'd be using Oracle or MySQL under Linux instead of that lame excuse for an SQL server from Microsoft -- it's so dumb that it won't let you create a relationship between two tables in different "databases" on the same server! Thanks, Microsoft!
Re:use servlets (Score:1)
It just seems to me that the majority of web scripts are more like "hello world" than they are complex scripts that would benefit from OO.
Re:Who's to blame here? (Score:1)
Re:Been there, done that (Score:1)
I designed an inventory system using MS Access, also because I had no SQL servers set up and I needed the database immediately (pen and paper sucks).
While the MS forms and database I set up were being used for initial data gathering, I set up Roxen's web server on the Windows machine and started developing a front end and some queries using the wonderful RXML enhancements, including the SQL tags Roxen includes (damn, Roxen's nice).
After getting some basic front-end parts up, I set up a Linux box and installed Roxen and MySQL (SuSE ships with both, but I rolled my own just to be on the bleeding edge and to get 128-bit encryption). Moving the web content to the Linux box was 100% painless, since Roxen is virtually identical on both platforms.
Moving the database over was almost as simple, using one of the scripts a few others have mentioned, which I grabbed from the MySQL home page. I did have to change a memo field to a BLOB and change the default "Yes" in an int field to a "1", but that was it. Then a simple mysql -u me -p database < oldaccessdata was all it took to bring the old data over to the world of Real SQL. :)
It sounds like a lot, but in reality it only took me about a week to do the conversion -- including the time it took me to learn how to set up MySQL and teach myself the needed SQL (in addition to doing what they actually pay me to do). The little extra effort is well worth it for the significant fine-grained access control I gained from MySQL's grant tables, etc. -- and from a data access point of view, since everything's web-enabled now.
Well, NT's to blame. (Score:1)
I think it _is_ NT's fault, though; else why reboot? Just kill the process, and a decent OS will reclaim the memory. If your OS doesn't, then it's only doing half of its job!
dbi/dbd makes for good utility (Score:1)
Why make him work so hard? (Score:1)
Re:MySQL - ODBC throughput? (Score:1)
to pump over 14 MB from Access?
Re:Your problem is Access.. not NT. (Score:1)
PHP is the Bomb-diggity (Score:1)
Re:Converting between Access... (Score:1)
Linux and e-commerce go way back (Score:1)
"Proving" an OS (Score:1)
A single person sitting at a single system cannot, of course, "prove" the power or stability of any operating system. This scenario does not even rule out the possibility that some other person sitting at the same system might crash it every ten minutes.
I say "of course" because this is absolutely obvious to me. The fact that it's not obvious to our AC might indicate that he has only ever been exposed to a single operating system, and does not need much in the way of proof.
Database conversion (Score:1)
1. There are both middleware tools to convert Access databases and direct import facilities in the databases you want.
2. It is possible to create a new database from the old one *****if***** you are willing to write a program to do the conversion. Your new preferred databases are widely used in the open source community (MySQL is nice, and solid when properly configured), and one can convert from Access almost trivially, since the SDK showing the exact underlying format of the database is available for free from M$.
As others have also noted, Access is rather sloppy in its data rules, so be careful here.
Another solution is to use an open e-commerce solution and port your data. There was an article on our very own
Just my two cents.
***Unix is user friendly. It's just very particular about who its friends are***
hmmm, Access ain't relational - business needs v's (Score:1)
Having 2 NT machines dying may point to bad configuration (I can hear the sniggers now).
Problem 2: database problems
Our catalog has around 25,000 items in it. It's held in an Access database right now... around 14 Megs.
Been there, done that. At my old work [ringtail.com.au] we had similar problems. I think it's a bit premature to write NT off for this kind of problem just yet. It's not the environment (caveat: a well-configured NT system with min 128MB RAM, fast disk drives, etc.); it's your database engine.
Access is good for playing with, and possibly was good for serious applications a couple of years ago. But for a catalog of, say, 25,000 records and more than a half dozen users, Access will bog down under its own limitations (file-sharing database, not relational, memory leaks, your configuration settings, code access methods). Before you start changing the operating system, the code base, and more importantly the *mindset*, investigate an MS SQL Server (or other politically correct DB) license.
If management is balking at the cost... then cost out the hours to rebuild, test, and get the system working using a Linux/DB/scripting approach. I say this because it's as much a business problem as a technical one. Change takes time, and time equals cost -- possibly the same money as a new 'relational' database on your existing code base. With NT SQL Server you have the ability to upsize the DDL, re-insert the data, etc...
<rant>
Beware of suggestions to change your current setup from those preaching alternative OSes without fully investigating your current problems.
Changing to Linux, for instance, is not the immediate problem.
I'm a bit dubious about posts just throwing around languages, OSes, and DBs without working the problem first. Don't get me wrong, I use/worship Linux as much as the next nutter... but don't do yourself and your company a disservice until you are 100% sure it's really that NT can't handle your business needs, and not some *geek-lust* need to install a new OS and code Perl.
</rant>
Re:Most likely ASP (Score:1)
Re:MSDE (Score:1)
Sybase 11.9.3 (Score:1)
I would hope that more people give this excellent opportunity a chance and test it out in their environments.
(I've been doing Sybase administration for almost 4 years now; I have never worked with a more extensible, reliable, and powerful system...)
-Dextius Alphaeus
Re:Suggestions - SSL Webservers (Score:2)
You can find its website here: http://www.covalent.net/ [covalent.net]
Or, if you live in a free country, you can use mod_ssl at http://www.modssl.org [modssl.org]
Also, I wouldn't really call it a close race between Postgres and MySQL on features. MySQL doesn't plan to do SQL transactions, for instance, while Postgres does. MySQL, on the other hand, has much friendlier SQL extensions, particularly for date formatting and such. Both have commercial support options.
Use Minivend (Score:1)
I've had excellent luck with this free product.
It takes some effort to pick up, but it's very flexible, and you get a lot of cool stuff without much effort: automatic shipping pricing based on zone, credit card encryption, database compatibility, awareness of accessories and properties (like color), quantity pricing, etc.
My catalog only has about 300 items, but it is reputed to work well up into the millions.
The basic concept is that you set it up to generate pages on the fly based on what's in the database. You can also link to it from static product pages if you want (which we do).
The programming is done by writing Perl code and sticking it in web pages to be run as the page loads. You can also call static functions that you put in configuration files, where they only have to be parsed once.
The only downside is that while there is about an inch-thick stack of documentation, it is very poorly written and hard to understand. Fortunately, there is an active mailing list for support.
Good Luck,
-Loopy
MySQL / JServ / Apache (Score:1)
Java Apache Page [apache.org]
This site will point you in the right direction for getting everything you need. Should you need a developer, I would suggest contacting Web Programmers, Inc. [webprogrammers.net] We have used them for our development, and they've done a great job on our upcoming project for Ruptime.com [ruptime.com].
MySQL == BAD for eCommerce (Score:1)
MySQL does the barest of bones SQL. There is no replication (pending, however) and no transactions (status?). If this guy's got a 25k-item catalog and is expecting any kind of web traffic and wants data integrity, he NEEDS transactions. Table locks suck.
I suggest Postgres (maybe; it's what I use, but not in a heavy production environment) or something meatier like Informix or Oracle. You do NOT want to be doing anything involving money or inventory without transactions and row-level locks.
Wait a minute... (Score:1)
But do you want this functionality in the DATABASE (Score:2)
If what you're really doing are transactions, this is fine. Build up sets of updates that are themselves transactions.
Keep in mind that the web server is not the back office; the data should get pushed over to the "heavy duty SQL box" when it comes time to do the accounting for either money or for inventory.
Consider the MySQL database to be an "embedded" database system, intended to support just the web application. Make that robust, and leave the "heavy production environment" stuff for the other server.
After all, you don't want customers outside to be directly hitting your master database, do you? I don't spell that "security."
Re:Your problem is Access.. not NT. (Score:1)
Morons, Access, and ActivePerl (Score:1)
No one's going to take you seriously. Anyone who has abuse but no better solutions to offer can and will safely be ignored. Anyway, the choice of Access may not have been his; he may just have had to pick it up. Don't go away mad.
As plenty of people have said, exporting data from an Access database is trivial. If you don't want to get a copy of Access, use ActivePerl (www.activestate.com) with DBI and DBD::ODBC and roll your own. You could even use another DBD for the database you plan to use and do the entire data migration in a single script.
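A sketch of that single-script idea, run under ActivePerl on the NT box; the DSN, host, and table names are invented.

    #!/usr/bin/perl -w
    # Pull rows out of Access via ODBC and push them straight into MySQL
    # in one pass -- no intermediate export file.
    use strict;
    use DBI;

    my $src = DBI->connect('dbi:ODBC:AccessCatalog', '', '',
                           { RaiseError => 1 });
    my $dst = DBI->connect('dbi:mysql:database=catalog;host=linuxbox',
                           'webuser', 'secret', { RaiseError => 1 });

    my $get = $src->prepare('SELECT sku, description, price FROM items');
    my $put = $dst->prepare(
        'INSERT INTO items (sku, description, price) VALUES (?, ?, ?)');

    $get->execute;
    while (my @row = $get->fetchrow_array) {
        $put->execute(@row);
    }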
I HAVE THE SOLUTION (for jsp anyway) (Score:1)
I was facing the exact same problem just last month. In my experience, if you want to develop scalable server-side software and you're using OO technology, you need a layer of database objects to encapsulate all your data access logic; you then build service-oriented objects on top of those. This is very similar to the EJB model (minus containers/transactions/etc.), but you can't afford EJB, because a good server is >$10k.
But this object layer can prove to be a high maintenance piece of software. Every time a data structure changes, both the object layer and the database scripts have to incorporate the change.
My software is written to read a single specification file, and from that file it generates:
So, yes, the specification file you feed the script can be quite complex, but it's nice to maintain your database AND your object layer from one place.
Well, my software is really just a hack I put together, I wouldn't be surprised at all if many other people have done the same thing already. I dunno.
Anyway, if anyone's interested in using my software, it's still not 100% foolproof, so I'd like to give it some nice stress testing in an open source environment.
Any takers?
- jonathan.
"Every tool is a weapon, if you hold it right." - ani difranco
Re:Been there, done that (Score:1)
1. MySQL is fast and stable; we have been running it for 6 months on a system averaging 80-100k transactions a week. The one problem is transactions, specifically the lack of COMMIT and ROLLBACK (the app has to keep 8-10 tables in sync per transaction). We ended up simulating COMMIT and ROLLBACK with Perl plus DBI, with no significant hit to performance (see the sketch after this list).
2. Attaching Access front ends to MySQL is trivial: ODBC and away you go. As pointed out, data migration, once the schemas are migrated, is pretty easy.
3. Schemas can be the problem. There is no easy way (that we have discovered, at least) to export schemas. This to date has been a 'by hand' job (excuse the pun).
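Roughly what our simulated ROLLBACK amounts to (tables and values changed to a toy example; this is the idea, not our production code): record a compensating statement for every write, and replay the list in reverse if any later step fails.

    use strict;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:orders', 'webuser', 'secret',
                           { RaiseError => 1 });
    my @undo;    # compensating statements, pushed as we go

    eval {
        $dbh->do('INSERT INTO orders (id, total) VALUES (?, ?)',
                 undef, 42, 19.95);
        push @undo, ['DELETE FROM orders WHERE id = ?', 42];

        $dbh->do('UPDATE stock SET qty = qty - 1 WHERE sku = ?',
                 undef, 'ABC123');
        push @undo, ['UPDATE stock SET qty = qty + 1 WHERE sku = ?',
                     'ABC123'];
    };
    if ($@) {
        # "ROLLBACK": apply the undo statements, newest first.
        for my $step (reverse @undo) {
            my ($sql, @bind) = @$step;
            $dbh->do($sql, undef, @bind);
        }
    }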
A few final ramblings:
We love Linux/Apache (mod_perl)/MySQL for infrastructure problems. Access is a great little quick-and-dirty tool for building entry and reporting screens for our MS-wedded customers. (NOTE: we did just recently convince one client to re-equip their 12-person data entry dept with little penguins.) We have started to use PostgreSQL for some more complex data problem spaces. The short-term results have been very positive.
mitd
Whee! Look at me! (Score:2)
Anyway, regarding picking a Unix-type OS and database: it's obviously a complex issue. The machine crashing may be the immediate problem, but there are long-term issues to face. For instance, I have loaded machines (loaded as in doing something, i.e., not idle) that stay up hundreds of days between upgrades. That's without reading between the lines, reloading the OS, hacking the configuration, or random parts of the OS breaking between upgrades.
People may say that Unix systems require less effort to run, but what they really require is more knowledge. For instance, the primary web server I run for an "e-business" is a single Debian machine on a Pentium Pro 200. Through several Debian upgrades (including libc5 -> libc6) it has always been stable and reliable. No service packs that break half the stuff, no middle-of-the-night crashes, nothing. The administration effort to run the box (which does hundreds of thousands of $$ of business a year) is a few hours a month. The cost of the setup was around $5000.
Fast-forward to my day job at a Fortune 500 company.
There, they recently migrated our mail server from a single (1) machine running Netscape mail server to a farm of NT servers running Exchange. The Netscape mail server was on a Sun Enterprise and was rock solid. The Exchange servers, on the other hand, are on a weekly reboot schedule. Our Exchange/NT team had done all it could and came to the conclusion that either the machines could be rebooted every 7 days or they would crash on their own every 10.
Also of interest are the management capabilities inherent in Unix-based systems vs. Exchange. For instance, on the Netscape mail server, if a user wanted files from his mailbox restored, a few files were restored from backup, and presto! On the Exchange server, the entire mail database has to be rolled back to the state where the files still existed (for *everyone*).
Another item of interest: when doing the mail server migration, the postmaster box ended up with over 60,000 pieces of mail in its box from warnings. With the server on Solaris, I was able to write a quick Perl script to delete the messages with a specific subject line. The Exchange team's answer to a similar problem (this time with 100,000 emails) was to pull them up in Outlook and delete the messages 10 at a time. Of course, that wasn't possible, as the machine would just freeze due to the insane amount of RAM required to do such a thing (not to mention the time required to do it 10 at a time...). Luckily, one of the up-and-coming Unix geeks had an MS background and mentioned that Outlook delete filters would do the trick (which they did -- but it had to be done from the client side).
Anyway, the moral of the story is that NT server installations as a rule will cost more, require more maintenance, and make it difficult for you to fix things when it really counts.
Anyway, no matter what you do this time, I'd recommend you at least set up an experimental server doing similar things, to familiarize yourself with a Unix-like environment. And learn enough Perl (or Python/Zope or PHP or Java...) to put together the kind of web application you'd want. My personal favorite is Perl, as CPAN has modules to do just about anything, and it's been invaluable to me as a system administrator and web programmer, but I know people have done very cool things with the others also. Also, learn SQL. I'd recommend Postgres and MySQL (whichever fits the job), and possibly Sybase (but its proprietary nature can be a pain at times). Don't forget about FreeBSD, either. Its scaled-down nature can make administration easier when you only want specific things on a box, and at present it has some large-file advantages over Linux.
Re:Whee! Look at me! (Score:2)
Btw, the "requires more knowledge" part mostly applies to getting started with the OS. It could require less knowledge to get a unix box running stably than it does to get an NT box running stably (and said knowledge is easier to come by) For example, you may have to do a bit of reading and compiling to get things how you want them, but the compiling is documented. Also, at least in the open source realm, mysterious failures tend not to be a problem. In the event something *does* fail, it's usually easy (or at least straightforward) to figure out why it's failing and how to fix it (as opposed to trying 25 different driver version, service pack version and system resource configurations) Even in the Unix world, Netscape Enterprise vs. Apache is a case in point here. When our corporate Netscape Enterprise server mysteriously failed to start one day due to a "sig6 pipe closed" error, it turned out that the search engine's index had gotten corrupted, and disabling the search engine prior to starting the netscape server would fix it. It took a call to netscape support to figure this out, though - running truss (equiv of strace on Linux) on the server didn't help, nor did the system or server logs.
Another thing I should mention: if you put Linux on a web server and the application outgrows the machine (or OS), it's trivial to port the application up to a more "enterprise"-class OS such as Solaris. At the Fortune 500 I mentioned in my last post, the enterprise web server was recently moved from a big Irix machine to a big Solaris machine. Since all the in-house apps were coded in Perl, nobody's application required any porting. The same would also be true for PHP or Python/Zope (and even Java if you're lucky). Of course, you could also design the application with real scalability in mind and, for instance, use 2000 FreeBSD web servers like Microsoft does with Hotmail, and let something like Solaris handle the back-end mass storage. (Or like deja.com and google.com do with Linux... or yahoo.com does with FreeBSD.)
Perl serial port programming (Score:2)
Using the SerialPort package with Perl is very flexible and powerful; it lets you do exactly what you want/need to do. I used it recently to prototype a project, presuming that looping to satisfy reads, timing out, calculating CRCs, and such on the fly in Perl would require a rewrite in C once I got it all working. Not a problem. It's fast and flexible, and very robust, because Perl makes it easy to do things right.
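A sketch of the order-forwarding side with Device::SerialPort; the device, line settings, and the ORDER/ACK exchange are all invented for illustration (on NT, the sibling module is Win32::SerialPort).

    #!/usr/bin/perl -w
    # Send one order record down the serial line and wait for an ACK.
    use strict;
    use Device::SerialPort;

    my $port = Device::SerialPort->new('/dev/ttyS0')
        or die "can't open /dev/ttyS0: $!";
    $port->baudrate(9600);
    $port->databits(8);
    $port->parity('none');
    $port->stopbits(1);
    $port->read_char_time(0);
    $port->read_const_time(1000);    # give reads a 1-second timeout
    $port->write_settings or die "bad line settings";

    $port->write("ORDER|42|ABC123|1\r\n");

    my ($count, $reply) = $port->read(255);
    die "no ACK from the main system\n" unless $count && $reply =~ /ACK/;
    print "order accepted\n";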
NT is dying ... (Score:1)
If your NT systems are constantly dying, then you probably have a misconfigured item somewhere.
Well-configured NT systems are as stable as Unix machines.
Of course, it is MS's own mistake that there are so many 'bad' sysadmins out there. Unix has thousands of capable sysadmins; they learned the trade through ten years or more of experience. NT admins, however, often just follow a course, take an NT-certified-whatever exam, and hup -- they are promoted to full-blown NT sysadmins.
If you compare NT to Unix, don't forget to compare their sysadmins as well.
But... NT is harder to administer than Unix, so it is quite natural to have fewer good NT admins than Unix admins.
Re:Your problem is Access.. not NT. (Score:1)
I've installed Oracle on NT, Linux, and Solaris. Oracle for Linux is riddled with bugs -- both 8.0.5 and especially 8i. There are numerous patches and workarounds you must apply, or it simply will not work.
I was aghast at the quality level (or the lack thereof) of such an expensive piece of software. Even if the install goes well, you must understand that Oracle, by nature, uses up a LOT of resources. I've just finished installing 8i on a dual-500MHz system with 1 gig of RAM and RAID 0/RAID 5 (45 gig total). But once installed, it's faster than hell, and you have a huge number of tools to do anything you want.
The easiest install and most stable operation is on Solaris, where I was able to get it to work with 128MB RAM and a 16-gig RAID.
Oracle on NT is as much pain as it is unstable. Most of the fault lies in NT, I'm sure. But we do use it because some third-party tools require NT.
Oracle's nice if you've got the money and the resources (no, maybe just money). But what I like best is just a simple Linux/MySQL combo. This is also faster than a greased pig, but much, much, much easier than Oracle to implement. I mean, I can install Linux in about 15 minutes and MySQL in about 5-10 minutes. Throw in Apache and PHP, and I can have a development platform in less than an hour. And despite all the nifty tools Oracle has for storage management, data integrity, and backup, I can write my own for MySQL easily.
Re:Suggestions (Score:1)
While I use it only for relatively small DBs (2-5 MB), it works over a modem (33.6k) connection without a hitch (it's naturally slow, though).
I don't know about MySQL, but your question sounds like Postgres would be a very good solution; after all, you seem not to need the fastest database available so much as a very reliable one.
This is not to say that MySQL isn't reliable, but it doesn't do transaction logging.
Toolkit for putting FoxPro data on the web (Score:1)
If you are going to do anything involving both Visual FoxPro and the Internet, you should really check out Web Connection [west-wind.com]. It's a framework that allows you to create HTML pages on the fly directly from VFP code. It's really fast, and the toolkit is highly scalable and very well supported.
Visual FoxPro doesn't get much press from Microsoft because they want to push SQL Server and seat licenses. But people who use it know that VFP is very fast and reliable. If you are going to mess with databases, you should be coding in a database language.
Web Connection runs Egghead's SurplusDirect site (among others [west-wind.com]), which gets up to 55,000 hits an hour during peak bidding times. VFP can handle it, as long as you're not writing crappy, inefficient code.
There is support for XML, MSMQ, COM, sending data over HTTP (with a VFP app on the client side), email generation, PDF doc creation, etc. You can also access other databases from within FoxPro using ODBC.
At work, I developed an e-commerce project that does all the online work for several clients we support, all running from one base class, with all coding done in FoxPro. One of the best things is that you can debug it just like a regular VFP app.
I'm not associated with this company, but I have used the products and can tell you they're very powerful. The price is way cheap, and a shareware version is available for download. Check it out. Note: this is for Windows only, but you are not tied to IIS.
If anyone needs convincing.... (Score:2)
E-commerce on linux/unix systems (Score:1)
Re:NT is dying ... (Score:1)
Well-configured NT systems are as stable as Unix machines.
Possibly, but Unix machines tend to be well-configured by default. Performance optimizations may be possible, but I've never heard of a Unix box "constantly dying" because of a sloppy setup.
Re:Your problem is Access.. not NT. (Score:2)
I think it's clear from this thread that Access just isn't designed to run your enterprise. At best, it's for non-DBAs to run simple databases for a relatively small number of users. So you're going to want to convert to a full-strength DBMS.
The real problem with converting from Access to a real DBMS like MySQL is not converting the data itself, but converting the stuff that's not the data. An MDB file is not just tables of data; it's an all-encompassing file that contains tables, queries, forms, reports, macros, VB modules, etc.
Any good database has rules that prevent invalid data from being entered, just as any good program has error handling to prevent any possible user entry from crashing the program. When you get bad data in your database, the results are worse, because you won't have any indication that your data is bad until it gets REALLY BAD, or painfully obvious. Not a good thing for business.
The problem with Access is that it doesn't have a tool for managing all of these rules. They are hidden: partly in the table definitions, partly on your forms, partly in macros and VB modules, and partly in the "Relationships" view. A business-class DBMS will have a tool to manage all these rules and mechanisms in one place. It will also allow flexible rules. For example, you can relax the rules in order to import a table that you know contains errors; then it can find all the errors, separate or fix them, and start enforcing the rules for all new data entry. Access just isn't that sophisticated, and what controls it has, as I mentioned, are not all in one place.
The bottom line is that while it's easy to copy the data, it's very difficult to port over all of these hidden mechanisms that control data quality. That would require someone who knows a lot about Access, which I assume from your question you are not (no offense). Your alternative is to copy the data to your favorite DBMS (MySQL, Oracle, or whatever) and then set up all your data integrity rules from scratch on the new system. That's probably what you'd end up doing anyway, even if you could find all the hidden stuff in Access. It's also not a bad thing to do, because it will force you to review and reconsider what data you are storing, how it is stored, how everything relates, what your data entry processes are, etc. That process is always a good thing. You would be wise to get help from a DBA.
If you feel confident hacking a DBMS without a DBA, then more power to ya, but you would be wise to proceed cautiously with your company's data. Build the tables and rules in your new DBMS and test them extensively with test data. Make sure it not only works but also robustly handles errors and bad data entry before you stake your business on it. If you don't believe that it's important to address data integrity issues, take a look at the Y2K problem/response/expense and realize that you can have the same problems with ALL of your data, not just the dates.
If you aren't familiar with what data integrity/quality is, or how to maintain it, try the following links:
Database Central [baobabcomputing.com]
DM Review [dmreview.com]
Best of Luck.
Re:use servlets (Score:2)
One nice thing servlets have going for them is that all of the session management is handled for you. This is a big benefit.
Usually the only people who complain about Java's speed in a server-type environment have never even compared it. Sort of like all those Windows users who think Linux sucks but can hardly spell it.
I've done both CGI (even in C) and servlet development. And although I would have to be dragged kicking and screaming into full-time web development, I would still recommend Java servlets in the right situation.
My (stagnating) progeeks.com web site uses servlets, and I was able to code the entire thing in one weekend. (Well, the parts I have finished, anyway.) The company I actually work for is switching all of their web apps to servlets.
Fun stuff,
-Paul (pspeed@progeeks.com)
Re:Whee! Look at me! (Score:2)
I haven't met any NT admins who can demonstrate the same. It's always a story about how a friend of a friend had a cousin who worked at some important place that had a cream-of-the-crop admin who was able to push NT to a whopping 60 days of uptime or somesuch.