PHP Transparent Database Access layer – PHP Object-Relational Mapping (PHP ORM)

Relational data (databases) and Object Oriented Programming (OOP) are not a match made in heaven. The way we work with objects is very different from the way we access relational data, yet in most projects you need to do both: query the data and write object oriented classes to display and manipulate it.
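
To see the mismatch in code, here is a deliberately tiny contrast between raw SQL access and an object wrapper. This is a hand-rolled PHP sketch with a hypothetical users table, not the actual layer described in the full post:

<?php
// Plain relational access: you think in tables, rows and SQL strings.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$stmt = $pdo->prepare('SELECT id, name, email FROM users WHERE id = ?');
$stmt->execute(array(42));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
echo $row['name'];

// Object oriented access: the rest of the code wants to think in objects instead.
class User
{
    public $id;
    public $name;
    public $email;

    public static function find(PDO $pdo, $id)
    {
        $stmt = $pdo->prepare('SELECT id, name, email FROM users WHERE id = ?');
        $stmt->execute(array($id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        if (!$row) {
            return null;
        }
        $user = new User();
        $user->id    = $row['id'];
        $user->name  = $row['name'];
        $user->email = $row['email'];
        return $user;
    }
}

// A transparent data access layer aims to give you this style
// without hand-writing find() and the mapping for every table.
$user = User::find($pdo, 42);
echo $user->name;
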
Continue reading

Add your blog to the World Blog Map

To add your blog, please fill in this form:

* Your Name
* Blog Name
* Blog URL
* Blog (short) description
* Blog image URL (small – 64x64) (optional)
* Blogger email (optional)
* Coordinates (tip: click on the map to set your coordinates automatically)

Then click “Put Your Blog on The Map”.

After you submit, your entry is sent to one of our moderators for review.

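For the curious, the server-side handling of such a form might look roughly like this. It is a minimal sketch only: the field names and the mail-to-a-moderator step are assumptions based on the form and the confirmation message above, not the actual code behind this page.

<?php
// Hypothetical handler for the blog-map form above (sketch, not the real code).
function field($name)
{
    return isset($_POST[$name]) ? trim($_POST[$name]) : '';
}

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $submission = array(
        'name'        => field('your_name'),
        'blog_name'   => field('blog_name'),
        'blog_url'    => field('blog_url'),
        'description' => field('description'),
        'image_url'   => field('image_url'),      // optional, 64x64
        'email'       => field('blogger_email'),  // optional
        'coordinates' => field('coordinates'),    // set by clicking the map
    );

    // Assumed moderation step: mail the submission to a moderator for review.
    mail('moderator@example.com',
         'New blog map submission: ' . $submission['blog_name'],
         print_r($submission, true));

    echo 'Thanks, your submission was sent to one of our moderators';
}
?>
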

World’s Blog Map

Want to put your blog on the map? Ever wondered where the blog you are reading comes from? Where are most bloggers located? What would a World Blog Map look like?

Here is the first worldwide map of blogs from all over the world.
Click on a marker to find out more about that blog.

To submit your blog to the map, click here.




Feel free to view source and see how this is done.

Open source community should employ Microsoft’s business strategies

At one of the press conferences I attended six months ago, Steve Ballmer was asked, “Why can’t Microsoft be cool like Apple?”

His reply was simple and to the point:

“I’d rather be popular than cool.”

He further explained that Microsoft business strategy is to try to reach EVERYONE and not only the cool guys.

I suggest that we, as the open source community, employ the same logic:

“We’d rather be popular than the good and righteous.”

Let me explain why we should be popular rather than right.

The baseline assumptions of this article

  1. Microsoft is popular – at the time this article is written, Microsoft has a large market share in many IT and retail industries, so the company must be doing something right.
  2. Open source is less popular – at the time this article is written, most open source products do not have the market share their makers wished for, so there is still something we need to do.
  3. One of our targets as an open source community is to become more popular – growing popularity and market share will drive open source prosperity in many ways.

The secrets of Microsoft’s popularity and market share

These are all well known and published secrets:

  1. Microsoft focuses on creating mass-market products – if it will not bring in a billion dollars in revenue, you can forget about it.
  2. Microsoft excels at creating an ecosystem of partners and independent software vendors (ISVs) – it is estimated that Microsoft’s ecosystem earns $9 for every dollar Microsoft earns. Think about it!
  3. Microsoft spends millions of dollars on user experience – my Ubuntu might provide a great user experience, but to fix things I sometimes need to drop to the command line; it has been a very long while since I had to do that to fix something in XP. My point is that Microsoft puts great emphasis on how things look.

Why isn’t open source as popular as it should be?

As always, it is mostly a matter of perception:

End users: ‘It is too hard’, ‘I am not a techie’, ‘There is no proper support’, ‘It does not come out of the box’.

From a consumer perspective, most open source products are considered niche, the realm of the super-technical (AKA geeks). I am not talking about popular products like Firefox; I am talking about the less popular majority of the open source offering. Even consumers who hold the mistaken notion that open source is free in the “free beer” sense are still hesitant about support, user experience and simplicity. People think that fixing problems in open source products requires the command line, and let’s admit it: VI is still not the masses’ preferred way to fix problems. Additionally, most consumers do not install anything by themselves; they like their software to come out of the box, preinstalled and ready to go.

Partners and ISVs: ‘Too complex’, ‘High risk’, ‘No commercial support or liability’, ‘Contagious license – I would have to be open source as well’.

From a partner perspective, open source sometimes seems risky to work with. Some commercial independent software vendors are afraid of the lack of support and accountability. Simple, UI-driven integration between open source products is still rare, and human-readable documentation is sometimes lacking (geek forums aside). Additionally, some corporate lawyers are more afraid of open source licenses than of the bubonic plague. One should not underestimate this last point: lawyers will stop a development or partnering program in the blink of an eye, no matter how affordable and profitable it might be – it is their job to do so.

The first steps towards growing open source market share.

  1. Use the 80/20 – Pareto law.
  2. Create mass market products – 20% of the applications drive 80% of the adoption. Firefox is an example of a great adoption driver.
  3. Focus on simplicity – the KISS law. Keep the project easy to use – aim for the lowest common denominator and work up from there. Always ask yourself: if I wasn’t the subject matter expert, would I find it easy and simple? Always ask yourself: can I achieve this functionality in a simpler, clearer way?
  4. Focus on UX – user experience is underestimated by most developers, but please believe me: user experience is THE KEY to the success of 90% of software projects.
  5. Don’t use GPL – use a commercial-friendly open source license like Apache, MIT or MPL. Read the next section to learn why you should do that.

Why I think GPL is a poison pill for open source market share.

I was working at a big telco company when I got this kind of email: “All development on top of open source products or libraries should be stopped. A list of all open source licenses should be sent to the company’s law firm.”

GPL is considered a dangerous license. It does not matter at all whether that is true or not (I think it is not dangerous) – it is a matter of perception, not truth. Changing perception is extremely hard. Public perception sends people to jail and makes stock markets fall. Even Microsoft found it hard to change the perception of Vista.

GPL is considered a contagious, sticky open source license. It is also considered dangerous because people think it will make their software GPL without their consent, and worse than that, it is considered hard to understand and untested in court. If I were a lawyer, I would consider it a risk, and lawyers, even more than most people, try to avoid risks as much as they can.

Some might say that GPL has done the open source community a lot of good, that it is the right way to secure the openness of the code, but look at our basic assumptions – right is OUT, market share is IN.

We, as an open source community, should promote software released under MPL, Apache, and any other license that is considered safe and simple, and hope that the perception of GPL slowly fades away.

Will Microsoft be hurt by this new approach?

No, it will thrive. Microsoft, like all super-big companies, is driven by revenues and market growth. If open source is successful and drives the market to grow, it will also grow Microsoft’s market. In simple terms, if the market grows as a result of these recommendations – and it will – everybody will benefit: Google, IBM, Microsoft and all the rest of the big and small software and hardware vendors will cheer the open source community.

There is hope at the end of the tunnel.

We have made great progress; some open source products are now very popular on both the server side and the client side. If we employ slightly more business-savvy strategies and build strong partners and ecosystems, we will grow the entire software market. We will not only drive open source business, but also benefit others and even help fight the recession.

One big happy family, right?

Effective Development Environments – Development, Test, Staging/Pre-prod and Production Environments.

The following happens in many software projects:
At the start, it seems you only need one environment for your web application – well, at most two:
one development environment (AKA your PC) and one server.

But as time passes, you find you need additional environments:
the clients might want their own testing environment, and sometimes you need a pre-production or staging environment so business managers can approve the ongoing content as well as the look & feel.

Do you really need these environments? What are these environments good for?

Here is a short description of some of the more popular environments and their purpose.
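
As a taste of how the separation shows up in code, here is a minimal sketch of environment-specific configuration in PHP. The environment names follow this post; the APP_ENV variable and the values are assumptions, not part of the original article.

<?php
// Pick configuration by environment (sketch; APP_ENV and values are assumed).
$environment = getenv('APP_ENV') ?: 'development';

$configs = array(
    'development' => array('db_host' => 'localhost',      'debug' => true),
    'test'        => array('db_host' => 'test-db.local',  'debug' => true),
    'staging'     => array('db_host' => 'stage-db.local', 'debug' => false),
    'production'  => array('db_host' => 'db.example.com', 'debug' => false),
);

if (!isset($configs[$environment])) {
    die('Unknown environment: ' . $environment);
}

$config = $configs[$environment];

// Debug output is only acceptable outside production.
ini_set('display_errors', $config['debug'] ? '1' : '0');

The point is that the same code base runs everywhere; only the configuration changes between environments.
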
Continue reading

How much should a web site really cost?

Lately many people have asked me how much they should pay for a personal or small business website. It seems that these are good times to earn some extra cash from a web initiative.

The problem is that, for most people, building and hosting a website is somewhat of a mystery. And when people treat something as a mystery, they tend to over-complicate it and overpay for it. Some people I talked to spent $20-100 on simple, basic hosting alone! That is, in most cases, more than they need to pay.
Continue reading

How to do cross browser sanity testing in less than 5 minutes for free

Cross-browser, cross-operating-system testing is a costly and tedious task. Most often we test our web application on the Internet Explorer and Firefox installed on our machine and hope for the best. In big projects with rigid compliance requirements we test the major operating systems (Apple, Windows and sometimes a popular Linux distribution) with the major browsers. The underlying assumption is that this covers 99.x% of the population, and we are happy with that. But what if we could, without additional cost, see how our web site looks in many operating systems and multiple browsers?
Continue reading

Shared Hosting or Dedicated Hosting? Now it is easier to choose.

When you want to host your website with a hosting provider, one of the first choices you’ll need to make is whether to go with a dedicated server or shared hosting.

This is the key decision that this choozza helped me make. Choozza takes a different approach to the question: I defined my priorities and got a decision that was fine-tuned for me.

You will make the decision based on these criteria:

1) Big Websites Fit – Does this hosting option fit a big website, with a large volume of traffic?

2) Control – How much control do you have over the server?

3) Cost – How expensive is each of the options?

4) IP Address Issues – Will your website have its own IP address?

5) Performance Implications – What type of performance can you expect to get from the hosting? What can affect this performance?

6) Security – What are the security implications of going this way or the other?

7) Small Websites Fit – Does this hosting option fit a small, low-to-average-traffic website with no special needs (e.g. a blog)?

I filled in this choozza for one of my small web sites and the answer was shared hosting. The details provided really justified the decision.

10 things every software architect should consider (AKA – 10 key architectural concepts)

After a session I gave about scalability in Wellington, NZ, one of the developers asked me what a software architect should consider. I have gathered and compiled this list:

1. Security

Application security encompasses measures taken throughout the application’s life-cycle to prevent exceptions in the security policy of an application or the underlying system (vulnerabilities) through flaws in the design, development, deployment, upgrade, or maintenance of the application. [1]
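
A small, everyday example of the “flaws in development” part is SQL injection; the hypothetical PHP sketch below contrasts string-built SQL with a parameterized query (table and field names are made up):

<?php
// Vulnerable: user input concatenated straight into the SQL statement.
// $pdo->query("SELECT * FROM users WHERE name = '" . $_GET['name'] . "'");

// Safer: a prepared statement keeps the data separate from the SQL itself.
$name = isset($_GET['name']) ? $_GET['name'] : '';
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$stmt = $pdo->prepare('SELECT id, name FROM users WHERE name = ?');
$stmt->execute(array($name));
$users = $stmt->fetchAll(PDO::FETCH_ASSOC);
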

2. Reliability / Consistency

Data consistency summarizes the validity, accuracy, usability and integrity of related data between applications and across the IT enterprise. This ensures that each user observes a consistent view of the data, including visible changes made by the user’s own transactions and transactions of other users or processes. Data Consistency problems may arise at any time but are frequently introduced during or following recovery situations when backup copies of the data are used in place of the original data. [2]
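
At the application level, the classic tool for keeping related data consistent is the database transaction. A minimal PDO sketch (the accounts table and amounts are made up for illustration):

<?php
// Move money between two accounts: either both updates happen, or neither does.
$pdo = new PDO('mysql:host=localhost;dbname=bank', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

try {
    $pdo->beginTransaction();
    $pdo->prepare('UPDATE accounts SET balance = balance - ? WHERE id = ?')
        ->execute(array(100, 1));
    $pdo->prepare('UPDATE accounts SET balance = balance + ? WHERE id = ?')
        ->execute(array(100, 2));
    $pdo->commit();          // both changes become visible together
} catch (Exception $e) {
    $pdo->rollBack();        // on any failure, neither change is kept
    throw $e;
}
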

3. Scalability

Scalability is a desirable property of a system, a network, or a process, which indicates its ability to either handle growing amounts of work in a graceful manner, or to be readily enlarged. [3]

4. High Availability

High availability is a system design protocol and associated implementation that ensures a certain absolute degree of operational continuity during a given measurement period.

Availability refers to the ability of the user community to access the system, whether to submit new work, update or alter existing work, or collect the results of previous work. If a user cannot access the system, it is said to be unavailable. Generally, the term downtime is used to refer to periods when a system is unavailable. [4]
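
That “certain absolute degree” is usually quoted as a percentage, which translates directly into a downtime budget; a quick back-of-the-envelope calculation:

<?php
// Downtime budget implied by an availability target, per year.
$minutesPerYear = 365 * 24 * 60;

foreach (array(99.0, 99.9, 99.99, 99.999) as $availability) {
    $downtime = $minutesPerYear * (1 - $availability / 100);
    printf("%.3f%% availability allows about %.1f minutes of downtime per year\n",
           $availability, $downtime);
}
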

5. Interoperability / integration

Interoperability is a property referring to the ability of diverse systems and organizations to work together (inter-operate). With respect to software, the term interoperability is used to describe the capability of different programs to exchange data via a common set of exchange formats, to read and write the same file formats, and to use the same protocols. (The ability to execute the same binary code on different processor platforms is ‘not’ contemplated by the definition of interoperability.) The lack of interoperability can be a consequence of a lack of attention to standardization during the design of a program. Indeed, interoperability is not taken for granted in the non-standards-based portion of the computing world. [5]
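
As a tiny illustration of “a common set of exchange formats”: two programs that agree on JSON can exchange data without sharing any code (the payload below is hypothetical):

<?php
// Producer: serialize data to a common, language-neutral format.
$payload = json_encode(array('blog' => 'example.com', 'posts' => 42));

// Consumer (could be written in any other language): parse the same format back.
$data = json_decode($payload, true);
echo $data['blog'];
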

6. Maintainability

In software engineering, the ease with which a software product can be modified in order to:

* correct defects

* meet new requirements

* make future maintenance easier, or

* cope with a changed environment;

[6]

7. Recovery / DR

Disaster recovery planning is a subset of a larger process known as business continuity planning and should include planning for resumption of applications, data, hardware, communications (such as networking) and other IT infrastructure. A business continuity plan (BCP) includes planning for non-IT related aspects such as key personnel, facilities, crisis communication and reputation protection, and should refer to the disaster recovery plan (DRP) for IT related infrastructure recovery / continuity. [7]

8. Performance

Performance testing determines how fast some aspect of a system performs under a particular workload. It can also serve to validate and verify other quality attributes of the system, such as scalability, reliability and resource usage. Performance testing is a subset of performance engineering, an emerging computer science practice which strives to build performance into the design and architecture of a system, prior to the onset of the actual coding effort. [8]
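
Getting a first number does not require heavy tooling; a crude timing sketch in PHP (md5() stands in for whatever operation you actually want to measure):

<?php
// Crude timing of a single operation under a small, repeatable workload.
$start = microtime(true);

for ($i = 0; $i < 10000; $i++) {
    $hash = md5('request-' . $i);   // stand-in for the real work being measured
}

$elapsed = microtime(true) - $start;
printf("10000 iterations took %.4f seconds (%.4f ms each)\n",
       $elapsed, $elapsed / 10000 * 1000);
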

9. Standards/ Compliance

A software standard is essentially a set of terms, concepts and techniques agreed upon by software creators so that many different pieces of software can understand each other.

For instance, HTML, TCP/IP, SMTP, POP and FTP are software standards that basically all software designers must adhere to if their software is to interface with these standards. For example, in order for an email sent from the Microsoft Outlook application to be read in Yahoo! Mail, and vice versa, Outlook needs to send the email using the SMTP (Simple Mail Transfer Protocol) standard, and Yahoo! Mail receives and displays it through SMTP as well. Without a standardized technique for sending an email from Outlook to Yahoo! Mail, the two would not be able to accurately display emails sent between them. Specifically, all emails essentially have “from,” “to,” “subject,” and “message” fields, and those are the standard by which all emails should be designed and handled. [9]

10. User experience

A newly added member – user experience design is a subset of the field of experience design which pertains to the creation of the architecture and interaction models that shape a user’s perception of a device or system. The scope of the field is directed at affecting “all aspects of the user’s interaction with the product: how it is perceived, learned, and used.” [10]

Seems about right… What do you think?