Brian Bailey
Consultant

Brian Bailey is an independent consultant working in the fields of Electronic System Level (ESL) methodologies and functional verification. Prior to this, he was the chief technologist for verification at Mentor Graphics. He is the editor of the EETimes EDA Designline and a contributing editor to EDN. He has published seven books, given talks around the world, chairs international standards committees (is he crazy?), and sits on the technical advisory boards of several EDA companies. He graduated from Brunel University in England with a first-class honours degree in electrical and electronic engineering (yes – he is another Brit, so of course he is crazy).


Brian Bailey's contributions
  • 02.07.2011
  • Are there any FPGA tool developers out there?
  • Max, "undisclosed amounts" means that they were small amounts and probably did not even return the investors' money. It usually means that they could not find additional investment; another term often used is "fire sale".
  • 12.14.2010
  • What will it take for FPGAs to become as ubiquitous as processors?
  • Some wonderful comments, and some of them are issues that I considered myself. For example, Frank says that FPGAs compete with ASICs, not processors. I see them as a continuum. We need custom logic because processors are not fast enough, or frugal enough with power, but many companies cannot afford the expense of ASICs, so FPGAs are an alternative for small-volume products. But why can't FPGAs become more like processors? Part of it is, I think, as Dr DSP and KB3001 point out, a matter of standardization and encapsulation. Independence is highly important: processors enabled the task independence that we sorely need. I find it funny that I am accused of taking the software side, as I am a hardware engineer at heart and was a developer of EDA solutions for many years, and am now just someone who ponders what is wrong with the industry we work in. Thanks for the comments and keep them coming.
  • 11.16.2010
  • Are low power and FPGA an oxymoron?
  • Hi Frank, I think you are dead on, and in some ways that is the point I was trying to make: it is designers who can impact power more than anything else, including implementation technologies. But if you have a good designer who knows what to do to create the best power-friendly architecture, then everything else just adds to his ability to create a low-power solution – and then, yes, an ASIC will have lower power than the same implementation in an FPGA.
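To make the point concrete: dynamic power in CMOS logic is commonly approximated as P = α·C·V²·f, and the activity factor α is largely in the designer's hands. A minimal sketch (all numbers below are hypothetical, not measurements) of how an architectural choice such as clock gating can outweigh the implementation technology:

```python
# Dynamic power approximation: P = alpha * C * V^2 * f, where
# alpha = switching activity, C = switched capacitance (F),
# V = supply voltage (V), f = clock frequency (Hz).
def dynamic_power(alpha, c, v, f):
    return alpha * c * v * v * f

# Hypothetical numbers: an FPGA design with aggressive clock gating
# (low activity factor) vs. an ASIC design that toggles freely.
fpga_gated   = dynamic_power(alpha=0.05, c=2e-9, v=1.0, f=100e6)  # 0.010 W
asic_ungated = dynamic_power(alpha=0.25, c=1e-9, v=1.0, f=100e6)  # 0.025 W

print(f"FPGA (gated):   {fpga_gated:.3f} W")
print(f"ASIC (ungated): {asic_ungated:.3f} W")
```

With these invented figures the FPGA wins despite its higher capacitance, purely because the architecture keeps α low – which is the designer's contribution, not the technology's.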
  • 10.04.2010
  • To emulate or prototype?
  • I hope that some of the bullet points will help – at least to start with – to determine which you may have more success with. Then, as you say, there are the goals and outcomes that a team expects from it. Costs are very different. Suppose, for example, that you want to create 20 or 100 prototypes to give to a set of software developers spread around the world. It may not be possible to do that in an economic fashion with an emulator – and besides, visibility into the hardware may not be an issue. But if you are trying to debug some core driver routines that have timing issues, then the emulator may be more suitable for that task. Every development team and situation is different, so any vendor who says their solution is always the right one is lying. Choosing the right way to go takes planning and being aware of all of the trade-offs that you are making.
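As a back-of-envelope illustration of the cost point (every figure below is invented for illustration, not vendor pricing), per-seat prototype boards scale very differently from a large, time-shared emulator:

```python
# Hypothetical cost model: N software developers each need access to
# the design. Prototype boards are bought per seat; an emulator is a
# large fixed cost, time-shared by a limited number of users per unit.
def prototype_cost(n_seats, board_cost=20_000):
    return n_seats * board_cost

def emulator_cost(n_seats, unit_cost=1_000_000, seats_per_unit=8):
    units = -(-n_seats // seats_per_unit)  # ceiling division
    return units * unit_cost

for n in (4, 20, 100):
    print(f"{n:3d} seats: prototypes ${prototype_cost(n):>9,}"
          f" vs emulator ${emulator_cost(n):>10,}")
```

Under these assumed numbers the emulator never catches up for a distributed software team – which is exactly the scenario where visibility into the hardware matters least.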
  • 09.21.2010
  • Debugging FPGAs at full speed
  • I agree that this is not a good technique for finding a sub-clock timing problem; it is for functional problems.
  • 09.21.2010
  • Debugging FPGAs at full speed
  • You are right that if you are on the limit in terms of timing, then adding any additional logic can have an impact. There are ways to mitigate this for some designs by buffering locally so that long paths are not added to signals, but this adds more area overhead. The only way to do it 100% non-intrusively is through external monitoring – and even this can change timing by adding the probes. While I cannot know the specifics of your situation, I would look to see if I could slow down the entire application, say by 10%, while performing debug, and thus provide a little more timing leeway.
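The leeway gained from that 10% slowdown is easy to quantify. A small sketch, assuming a nominal 100 MHz clock (a hypothetical figure):

```python
# Extra per-cycle timing slack from running the whole design 10%
# slower during debug. All numbers are illustrative.
def period_ns(freq_hz):
    return 1e9 / freq_hz

f_nominal = 100e6          # 100 MHz -> 10.00 ns clock period
f_debug   = f_nominal * 0.9  # 10% slower -> ~11.11 ns period

extra_slack = period_ns(f_debug) - period_ns(f_nominal)
print(f"Extra slack per cycle: {extra_slack:.2f} ns")
```

Roughly 1.1 ns of slack per cycle at this clock rate – often enough headroom to absorb the delay of the inserted debug logic without changing functional behavior.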
  • 09.07.2010
  • Are FPGA tools dumb?
  • I would agree with that, but I do not want to call out any specific area as being more advantaged than others. I said in a presentation at DAC that I expect the first people to have a fully operable ESL flow will be the platform chip providers, such as Cypress, TI, etc. These have constrained architectures, which makes it easier to put full flows together. Following on their heels will be flows for FPGAs, and then finally for full ASICs. This all has to do with the implementation constraints taking away degrees of freedom and making tool creation easier.
  • 09.07.2010
  • Are FPGA tools dumb?
  • I am not sure I fully understand the question, but let me try. Every implementation target is going to have a library of devices. For FPGAs these are either built out of the array primitives, or are larger units integrated onto the fabric. For ASICs there is a lot more variety in the building blocks, their sizes, their power profiles, etc. Many of the synthesis tools will automatically do the selection to make the best choices for the optimizations you have requested. I am not sure that people would manually make most of these selections today. Now, it is possible that the technology, fab, and physical libraries used were selected because of the availability of certain cells, or general characteristics that a design intends to take advantage of, but that is a macro decision and not one made on a cell basis, or often even on a design basis. That may well be a strategic decision. The only thing I can think you are asking about is the characterization process itself. This will often use simulation at certain operating conditions to extract information such as timing, power, etc., or these calculations may be done on the fly by certain synthesis tools.
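The characterization step described above can be pictured as simulating a cell at a few operating points and interpolating between them. A toy sketch – the voltages and delays below are invented, and real flows use far richer table and polynomial models:

```python
# Toy cell characterization: a cell's delay is "measured" (here,
# invented values standing in for circuit simulation) at two supply
# voltage corners, then estimated elsewhere by linear interpolation.
simulated = [(0.9, 1.8), (1.1, 0.7)]  # (supply V, delay ns) per corner

def delay_at(v, corners=simulated):
    (v0, d0), (v1, d1) = corners
    t = (v - v0) / (v1 - v0)
    return d0 + t * (d1 - d0)

print(f"Estimated delay at 1.0 V: {delay_at(1.0):.2f} ns")  # midpoint
```

The point is only structural: the library stores a handful of simulated operating points, and the tools derive the numbers they need for each design from them.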
  • 06.23.2010
  • Fundamentals of Mixed-Signal SoC Development
  • Unfortunately, most analog components are highly specialized, and in the same way that we have problems with generalized abstractions for analog models, the same problems exist when trying to make them programmable, such as in an FPGA. There have been programmable op-amps and other specific analog components, but just think about the general difficulty of taking analog components such as capacitors and resistors and routing them through a programmable fabric, when the interconnect has capacitance and resistance itself. So, unfortunately, this is not possible today.
  • 07.18.2007
  • Broadcom versus Qualcomm: Patents and the International Trade Commission
  • I think the ultimate solution is a little more complex than you portray here, and it is one that is becoming fundamental to the entire chip industry. The problem is that no one company can now produce the totality of the intellectual property that goes into a chip. While things such as processors have been licensed for some time now, a cell phone chip is no longer just a phone; it is a GPS, MP3, email, Internet… device. What I see going on between Qualcomm and Broadcom is a battle between the specific and the general in terms of chip-level IP. Qualcomm controls an important piece of specific IP – namely the CDMA modem – without which complete phone chips cannot be made (ignoring GSM-based phones). Broadcom, on the other hand, has some general IP that applies to all chips. Which has more value? That is the crux of this argument. While Broadcom alleges that Qualcomm uses its general IP, Broadcom also wants access to the Qualcomm IP. Broadcom in effect says that its IP is just as valuable as that of Qualcomm. I believe the outcome of this will have a fundamental impact on the entire IP and chip industry.